
Accurate deep neural network inference using computational phase-change memory

Bibliographic Details
Main Authors: Joshi, Vinay, Le Gallo, Manuel, Haefeli, Simon, Boybat, Irem, Nandakumar, S. R., Piveteau, Christophe, Dazzi, Martino, Rajendran, Bipin, Sebastian, Abu, Eleftheriou, Evangelos
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2020
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7235046/
https://www.ncbi.nlm.nih.gov/pubmed/32424184
http://dx.doi.org/10.1038/s41467-020-16108-9
Description
Summary: In-memory computing using resistive memory devices is a promising non-von Neumann approach for making energy-efficient deep learning inference hardware. However, due to device variability and noise, the network needs to be trained in a specific way so that transferring the digitally trained weights to the analog resistive memory devices will not result in significant loss of accuracy. Here, we introduce a methodology to train ResNet-type convolutional neural networks that results in no appreciable accuracy loss when transferring weights to phase-change memory (PCM) devices. We also propose a compensation technique that exploits the batch normalization parameters to improve the accuracy retention over time. We achieve a classification accuracy of 93.7% on CIFAR-10 and a top-1 accuracy of 71.6% on ImageNet benchmarks after mapping the trained weights to PCM. Our hardware results on CIFAR-10 with ResNet-32 demonstrate an accuracy above 93.5% retained over a one-day period, where each of the 361,722 synaptic weights is programmed on just two PCM devices organized in a differential configuration.
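To make the two hardware ideas in the summary concrete, below is a minimal NumPy sketch of (a) injecting Gaussian weight noise during training forward passes, one common way to realize the kind of noise-robust training the summary describes, and (b) encoding each trained weight as the difference of two PCM conductances (the differential configuration). The noise ratio, the conductance ceiling `g_max`, and all function names here are illustrative assumptions, not values or APIs from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Noise-aware training idea (sketch, not the authors' exact recipe) ---
# Perturb weights with additive Gaussian noise on each forward pass so the
# network learns to tolerate the conductance variability of analog PCM.
def noisy_forward_weights(w, noise_ratio=0.05):
    """Return a noise-injected copy of the weights for one forward pass.
    `noise_ratio` is a hypothetical knob, not a value from the paper."""
    sigma = noise_ratio * np.max(np.abs(w))
    return w + rng.normal(0.0, sigma, size=w.shape)

# --- Differential PCM mapping (weight encoded as G_plus - G_minus) ---
# Each weight is programmed on two devices: positive weights go on G_plus,
# negative weights on G_minus; the read-out weight is their scaled difference.
def to_differential_conductance(w, g_max=25.0):
    """Map weights to a (G_plus, G_minus) pair.
    `g_max` is an assumed maximum conductance, for illustration only."""
    scale = g_max / np.max(np.abs(w))
    g_plus = np.clip(w, 0, None) * scale
    g_minus = np.clip(-w, 0, None) * scale
    return g_plus, g_minus, scale

def from_differential_conductance(g_plus, g_minus, scale):
    """Recover effective weights from the programmed conductance pair."""
    return (g_plus - g_minus) / scale

w = rng.normal(0.0, 0.1, size=(4, 4))          # toy weight matrix
w_noisy = noisy_forward_weights(w)             # one noise-injected forward pass
gp, gm, s = to_differential_conductance(w)
w_read = from_differential_conductance(gp, gm, s)
assert np.allclose(w, w_read)                  # lossless without device noise
```

In the real hardware, programming noise and conductance drift would perturb `gp` and `gm` over time; the paper's batch-normalization-based compensation (not modeled here) addresses that retention loss.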