Tobias Bachmann, Wabe W. Koelmans, et al.
Nanotechnology
In-memory computing using resistive memory devices is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to phase-change memory (PCM) devices. We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on CIFAR-10 and a top-1 accuracy of 71.6% on ImageNet benchmarks after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.
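The abstract names three ingredients without implementation detail: hardware-aware training so weights survive the transfer to analog devices, a two-PCM-per-weight differential mapping, and a batch-normalization-based compensation for conductance drift. The sketch below is one plausible rendering of these ideas, not the authors' exact recipe: the multiplicative Gaussian weight-noise model and its scale (noise_std), the layer name NoisyLinear, the device limit g_max, and the helper recalibrate_batchnorm (which re-estimates BatchNorm running statistics on a small calibration set as a stand-in for the paper's compensation technique) are all illustrative assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NoisyLinear(nn.Linear):
    """Hypothetical hardware-aware layer: during training, weights are
    perturbed with multiplicative Gaussian noise so the network learns to
    tolerate PCM programming variability. The noise model is an assumption,
    not the paper's exact procedure."""
    def __init__(self, in_features, out_features, noise_std=0.05):
        super().__init__(in_features, out_features)
        self.noise_std = noise_std  # illustrative relative noise level

    def forward(self, x):
        if self.training:
            # Scale the noise to the largest absolute weight, a common
            # hardware-aware heuristic.
            w_max = self.weight.abs().max().detach()
            noise = torch.randn_like(self.weight) * self.noise_std * w_max
            return F.linear(x, self.weight + noise, self.bias)
        return F.linear(x, self.weight, self.bias)

def to_differential_conductances(w, g_max=25.0):
    """Map a trained weight tensor to two non-negative conductance tensors
    (g_plus, g_minus) with w proportional to g_plus - g_minus, mirroring the
    two-PCM-devices-per-weight differential configuration in the abstract.
    g_max is an assumed device conductance limit."""
    scale = g_max / w.abs().max()
    g = w * scale
    return g.clamp(min=0), (-g).clamp(min=0)

@torch.no_grad()
def recalibrate_batchnorm(model, calibration_loader, device="cpu"):
    """One plausible reading of the batch-norm compensation: after the PCM
    conductances have drifted, re-estimate the BatchNorm running statistics
    on a small calibration set, leaving the mapped weights untouched."""
    for m in model.modules():
        if isinstance(m, nn.modules.batchnorm._BatchNorm):
            m.reset_running_stats()
            m.train()  # collect fresh running statistics
    for x, _ in calibration_loader:  # assumes (input, label) batches
        model(x.to(device))
    model.eval()
```

In this reading, only the digital batch normalization statistics are updated at calibration time, which is consistent with the abstract's claim that accuracy retention improves over time without reprogramming the PCM devices themselves.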