Robust high-dimensional memory-augmented neural networks
Traditional neural networks require enormous amounts of data to build their complex mappings during a slow training procedure that hinders their abilities for relearning and adapting to new data. Memory-augmented neural networks enhance neural networks with an explicit memory to overcome these issues. Access to this explicit memory, however, occurs via soft read and write operations involving every individual memory entry, resulting in a bottleneck when implemented using the conventional von Neumann computer architecture. To overcome this bottleneck, we propose a robust architecture that employs a computational memory unit as the explicit memory performing analog in-memory computation on high-dimensional (HD) vectors, while closely matching 32-bit software-equivalent accuracy. This is achieved by a content-based attention mechanism that represents unrelated items in the computational memory with uncorrelated HD vectors, whose real-valued components can be readily approximated by binary, or bipolar components. Experimental results demonstrate the efficacy of our approach on few-shot image classification tasks on the Omniglot dataset using more than 256,000 phase-change memory devices. Our approach effectively merges the richness of deep neural network representations with HD computing that paves the way for robust vector-symbolic manipulations applicable in reasoning, fusion, and compression.
Main Authors: | Karunaratne, Geethan; Schmuck, Manuel; Le Gallo, Manuel; Cherubini, Giovanni; Benini, Luca; Sebastian, Abu; Rahimi, Abbas |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2021 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8084980/ https://www.ncbi.nlm.nih.gov/pubmed/33927202 http://dx.doi.org/10.1038/s41467-021-22364-0 |
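The abstract above describes a content-based attention mechanism in which unrelated memory entries are represented by uncorrelated high-dimensional (HD) vectors whose real-valued components can be approximated by binary or bipolar ones, so that the explicit key memory can be read with simple dot products on analog in-memory hardware. The sketch below is a minimal NumPy illustration of that idea, not the authors' implementation: the dimensionality, the softmax sharpening, and every name in it are illustrative assumptions. It stores a handful of key-value pairs, binarizes the keys to ±1, and shows that a cosine-similarity read retrieves the same item from the bipolar keys as from the real-valued ones.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 512          # HD vector dimensionality (illustrative choice)
num_items = 5    # number of stored support items (e.g., a 5-way few-shot task)

# Real-valued keys, standing in for the output of a controller network;
# independent random vectors of this dimension are quasi-orthogonal.
keys_real = rng.standard_normal((num_items, d))

# Bipolar approximation: keep only the sign of each component, so that
# similarity search reduces to cheap dot products on +1/-1 values.
keys_bipolar = np.sign(keys_real)

# One-hot class labels stored alongside the keys as the "value" memory.
values = np.eye(num_items)

def attend(query, keys, values, sharpen=10.0):
    """Content-based read: cosine similarity against every key,
    softmax-sharpened into attention weights, then a weighted sum of values."""
    sims = keys @ query / (np.linalg.norm(keys, axis=1) * np.linalg.norm(query) + 1e-9)
    weights = np.exp(sharpen * sims)
    weights /= weights.sum()
    return weights @ values

# A noisy query near item 2 retrieves the same label from either key representation.
query = keys_real[2] + 0.3 * rng.standard_normal(d)
print(np.argmax(attend(query, keys_real, values)))     # -> 2
print(np.argmax(attend(query, keys_bipolar, values)))  # -> 2: binarization barely hurts
```

Because independent random HD vectors are nearly orthogonal, taking the sign of each component preserves their relative similarities, which is what allows such an explicit memory to be mapped onto binary in-memory devices such as the phase-change memory arrays reported in the paper.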
_version_ | 1783686255077228544 |
---|---|
author | Karunaratne, Geethan Schmuck, Manuel Le Gallo, Manuel Cherubini, Giovanni Benini, Luca Sebastian, Abu Rahimi, Abbas |
author_facet | Karunaratne, Geethan Schmuck, Manuel Le Gallo, Manuel Cherubini, Giovanni Benini, Luca Sebastian, Abu Rahimi, Abbas |
author_sort | Karunaratne, Geethan |
collection | PubMed |
description | Traditional neural networks require enormous amounts of data to build their complex mappings during a slow training procedure that hinders their abilities for relearning and adapting to new data. Memory-augmented neural networks enhance neural networks with an explicit memory to overcome these issues. Access to this explicit memory, however, occurs via soft read and write operations involving every individual memory entry, resulting in a bottleneck when implemented using the conventional von Neumann computer architecture. To overcome this bottleneck, we propose a robust architecture that employs a computational memory unit as the explicit memory performing analog in-memory computation on high-dimensional (HD) vectors, while closely matching 32-bit software-equivalent accuracy. This is achieved by a content-based attention mechanism that represents unrelated items in the computational memory with uncorrelated HD vectors, whose real-valued components can be readily approximated by binary, or bipolar components. Experimental results demonstrate the efficacy of our approach on few-shot image classification tasks on the Omniglot dataset using more than 256,000 phase-change memory devices. Our approach effectively merges the richness of deep neural network representations with HD computing that paves the way for robust vector-symbolic manipulations applicable in reasoning, fusion, and compression. |
format | Online Article Text |
id | pubmed-8084980 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2021 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-8084980 2021-05-11 Robust high-dimensional memory-augmented neural networks Karunaratne, Geethan Schmuck, Manuel Le Gallo, Manuel Cherubini, Giovanni Benini, Luca Sebastian, Abu Rahimi, Abbas Nat Commun Article Traditional neural networks require enormous amounts of data to build their complex mappings during a slow training procedure that hinders their abilities for relearning and adapting to new data. Memory-augmented neural networks enhance neural networks with an explicit memory to overcome these issues. Access to this explicit memory, however, occurs via soft read and write operations involving every individual memory entry, resulting in a bottleneck when implemented using the conventional von Neumann computer architecture. To overcome this bottleneck, we propose a robust architecture that employs a computational memory unit as the explicit memory performing analog in-memory computation on high-dimensional (HD) vectors, while closely matching 32-bit software-equivalent accuracy. This is achieved by a content-based attention mechanism that represents unrelated items in the computational memory with uncorrelated HD vectors, whose real-valued components can be readily approximated by binary, or bipolar components. Experimental results demonstrate the efficacy of our approach on few-shot image classification tasks on the Omniglot dataset using more than 256,000 phase-change memory devices. Our approach effectively merges the richness of deep neural network representations with HD computing that paves the way for robust vector-symbolic manipulations applicable in reasoning, fusion, and compression. Nature Publishing Group UK 2021-04-29 /pmc/articles/PMC8084980/ /pubmed/33927202 http://dx.doi.org/10.1038/s41467-021-22364-0 Text en © The Author(s) 2021 https://creativecommons.org/licenses/by/4.0/ Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/ (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Karunaratne, Geethan Schmuck, Manuel Le Gallo, Manuel Cherubini, Giovanni Benini, Luca Sebastian, Abu Rahimi, Abbas Robust high-dimensional memory-augmented neural networks |
title | Robust high-dimensional memory-augmented neural networks |
title_full | Robust high-dimensional memory-augmented neural networks |
title_fullStr | Robust high-dimensional memory-augmented neural networks |
title_full_unstemmed | Robust high-dimensional memory-augmented neural networks |
title_short | Robust high-dimensional memory-augmented neural networks |
title_sort | robust high-dimensional memory-augmented neural networks |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8084980/ https://www.ncbi.nlm.nih.gov/pubmed/33927202 http://dx.doi.org/10.1038/s41467-021-22364-0 |
work_keys_str_mv | AT karunaratnegeethan robusthighdimensionalmemoryaugmentedneuralnetworks AT schmuckmanuel robusthighdimensionalmemoryaugmentedneuralnetworks AT legallomanuel robusthighdimensionalmemoryaugmentedneuralnetworks AT cherubinigiovanni robusthighdimensionalmemoryaugmentedneuralnetworks AT beniniluca robusthighdimensionalmemoryaugmentedneuralnetworks AT sebastianabu robusthighdimensionalmemoryaugmentedneuralnetworks AT rahimiabbas robusthighdimensionalmemoryaugmentedneuralnetworks |