Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks
Shallow feed-forward networks are incapable of addressing complex tasks such as natural language processing that require learning of temporal signals. To address these requirements, we need deep neuromorphic architectures with recurrent connections such as deep recurrent neural networks. However, th...
Main Authors: | John, Rohit Abraham; Acharya, Jyotibdha; Zhu, Chao; Surendran, Abhijith; Bose, Sumon Kumar; Chaturvedi, Apoorva; Tiwari, Nidhi; Gao, Yang; He, Yongmin; Zhang, Keke K.; Xu, Manzhang; Leong, Wei Lin; Liu, Zheng; Basu, Arindam; Mathews, Nripan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Nature Publishing Group UK, 2020 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7316775/ https://www.ncbi.nlm.nih.gov/pubmed/32587241 http://dx.doi.org/10.1038/s41467-020-16985-0 |
_version_ | 1783550493301145600 |
---|---|
author | John, Rohit Abraham; Acharya, Jyotibdha; Zhu, Chao; Surendran, Abhijith; Bose, Sumon Kumar; Chaturvedi, Apoorva; Tiwari, Nidhi; Gao, Yang; He, Yongmin; Zhang, Keke K.; Xu, Manzhang; Leong, Wei Lin; Liu, Zheng; Basu, Arindam; Mathews, Nripan
author_facet | John, Rohit Abraham; Acharya, Jyotibdha; Zhu, Chao; Surendran, Abhijith; Bose, Sumon Kumar; Chaturvedi, Apoorva; Tiwari, Nidhi; Gao, Yang; He, Yongmin; Zhang, Keke K.; Xu, Manzhang; Leong, Wei Lin; Liu, Zheng; Basu, Arindam; Mathews, Nripan
author_sort | John, Rohit Abraham |
collection | PubMed |
description | Shallow feed-forward networks are incapable of addressing complex tasks, such as natural language processing, that require learning of temporal signals. To address these requirements, we need deep neuromorphic architectures with recurrent connections such as deep recurrent neural networks. However, the training of such networks demands very high weight precision, excellent conductance linearity and low write noise, requirements not satisfied by current memristive implementations. Inspired by optogenetics, here we report a neuromorphic computing platform composed of photo-excitable neuristors capable of in-memory computations across 980 addressable states with a high signal-to-noise ratio of 77. The large linear dynamic range, low write noise and selective excitability allow high-fidelity opto-electronic transfer of weights with a two-shot write scheme, while electrical in-memory inference provides energy efficiency. This method enables implementing a memristive deep recurrent neural network with twelve trainable layers and more than a million parameters to recognize spoken commands with >90% accuracy.
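The opto-electronic weight-transfer idea in the abstract hinges on one number: trained weights must be mapped onto the 980 addressable conductance states before electrical in-memory inference. The sketch below illustrates that kind of mapping only, not the authors' two-shot write scheme; the linear level spacing, the function name quantize_weights, and the matrix size are assumptions made for the example.

```python
# Illustrative sketch only: quantizing trained floating-point weights onto a
# fixed number of addressable device states, as memristive in-memory inference
# requires. The linear spacing and all names are assumptions, not from the paper.
import numpy as np

N_STATES = 980  # addressable conductance states reported in the abstract

def quantize_weights(w, n_states=N_STATES):
    """Map weights linearly onto n_states evenly spaced levels spanning their range."""
    w = np.asarray(w, dtype=np.float64)
    w_min, w_max = w.min(), w.max()
    if w_max == w_min:  # degenerate case: every weight identical, nothing to quantize
        return w.copy()
    levels = np.round((w - w_min) / (w_max - w_min) * (n_states - 1))
    return w_min + levels * (w_max - w_min) / (n_states - 1)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(0.0, 0.1, size=(1024, 1024))  # ~1e6 parameters, the scale quoted in the abstract
    wq = quantize_weights(w)
    print("worst-case transfer error:", np.abs(w - wq).max())
```

With 980 levels (just under 10 bits of resolution), the worst-case error of such a linear mapping is half a level step, which is why a large, linear, low-write-noise conductance range is what makes high-fidelity weight transfer plausible.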
format | Online Article Text |
id | pubmed-7316775 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | Nature Publishing Group UK |
record_format | MEDLINE/PubMed |
spelling | pubmed-7316775 2020-06-30 Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks John, Rohit Abraham Acharya, Jyotibdha Zhu, Chao Surendran, Abhijith Bose, Sumon Kumar Chaturvedi, Apoorva Tiwari, Nidhi Gao, Yang He, Yongmin Zhang, Keke K. Xu, Manzhang Leong, Wei Lin Liu, Zheng Basu, Arindam Mathews, Nripan Nat Commun Article Shallow feed-forward networks are incapable of addressing complex tasks, such as natural language processing, that require learning of temporal signals. To address these requirements, we need deep neuromorphic architectures with recurrent connections such as deep recurrent neural networks. However, the training of such networks demands very high weight precision, excellent conductance linearity and low write noise, requirements not satisfied by current memristive implementations. Inspired by optogenetics, here we report a neuromorphic computing platform composed of photo-excitable neuristors capable of in-memory computations across 980 addressable states with a high signal-to-noise ratio of 77. The large linear dynamic range, low write noise and selective excitability allow high-fidelity opto-electronic transfer of weights with a two-shot write scheme, while electrical in-memory inference provides energy efficiency. This method enables implementing a memristive deep recurrent neural network with twelve trainable layers and more than a million parameters to recognize spoken commands with >90% accuracy. Nature Publishing Group UK 2020-06-25 /pmc/articles/PMC7316775/ /pubmed/32587241 http://dx.doi.org/10.1038/s41467-020-16985-0 Text en © The Author(s) 2020 Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/. |
spellingShingle | Article John, Rohit Abraham; Acharya, Jyotibdha; Zhu, Chao; Surendran, Abhijith; Bose, Sumon Kumar; Chaturvedi, Apoorva; Tiwari, Nidhi; Gao, Yang; He, Yongmin; Zhang, Keke K.; Xu, Manzhang; Leong, Wei Lin; Liu, Zheng; Basu, Arindam; Mathews, Nripan Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks
title | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
title_full | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
title_fullStr | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
title_full_unstemmed | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
title_short | Optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
title_sort | optogenetics inspired transition metal dichalcogenide neuristors for in-memory deep recurrent neural networks |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7316775/ https://www.ncbi.nlm.nih.gov/pubmed/32587241 http://dx.doi.org/10.1038/s41467-020-16985-0 |
work_keys_str_mv | AT johnrohitabraham optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT acharyajyotibdha optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT zhuchao optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT surendranabhijith optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT bosesumonkumar optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT chaturvediapoorva optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT tiwarinidhi optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT gaoyang optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT heyongmin optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT zhangkekek optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT xumanzhang optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT leongweilin optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT liuzheng optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT basuarindam optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks AT mathewsnripan optogeneticsinspiredtransitionmetaldichalcogenideneuristorsforinmemorydeeprecurrentneuralnetworks |