
An in-memory computing architecture based on two-dimensional semiconductors for multiply-accumulate operations

In-memory computing may enable multiply-accumulate (MAC) operations, which are the primary calculations used in artificial intelligence (AI). Performing MAC operations with high capacity in a small area with high energy efficiency remains a challenge. In this work, we propose a circuit architecture...

Full description

Bibliographic Details
Main Authors: Wang, Yin; Tang, Hongwei; Xie, Yufeng; Chen, Xinyu; Ma, Shunli; Sun, Zhengzong; Sun, Qingqing; Chen, Lin; Zhu, Hao; Wan, Jing; Xu, Zihan; Zhang, David Wei; Zhou, Peng; Bao, Wenzhong
Format: Online Article (Text)
Language: English
Published: Nature Publishing Group UK, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8184885/
https://www.ncbi.nlm.nih.gov/pubmed/34099710
http://dx.doi.org/10.1038/s41467-021-23719-3
Description: In-memory computing may enable multiply-accumulate (MAC) operations, which are the primary calculations used in artificial intelligence (AI). Performing MAC operations with high capacity in a small area with high energy efficiency remains a challenge. In this work, we propose a circuit architecture that integrates monolayer MoS₂ transistors in a two-transistor–one-capacitor (2T-1C) configuration. In this structure, the memory portion is similar to a 1T-1C dynamic random-access memory (DRAM) cell, so the cycling endurance and erase/write speed theoretically inherit the merits of DRAM. In addition, the ultralow leakage current of the MoS₂ transistor enables the storage of multi-level voltages on the capacitor with a long retention time. The electrical characteristics of a single MoS₂ transistor also allow analog computation by multiplying the drain voltage by the voltage stored on the capacitor. The sum-of-products is then obtained by converging the currents from multiple 2T-1C units. Based on our experimental results, a neural network is trained ex situ for image recognition with 90.3% accuracy. In the future, such 2T-1C units can potentially be integrated into three-dimensional (3D) circuits with dense logic and memory layers for low-power in-situ training of neural networks in hardware.
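The sum-of-products scheme described above can be illustrated with a minimal numerical sketch. This is an idealization, not the authors' device model: it assumes each 2T-1C unit contributes a current proportional to the product of its stored capacitor voltage (the weight) and the applied drain voltage (the input), and that currents on a shared output line simply add. The scale factor `k` is a hypothetical lumped device constant.

```python
import numpy as np

# Idealized sketch of an analog MAC array built from 2T-1C units.
# Assumption (not from the paper): unit current I = k * V_stored * V_drain,
# i.e. a perfectly linear product; real transistor characteristics deviate.

rng = np.random.default_rng(0)

n_inputs, n_outputs = 4, 3
weights = rng.uniform(0.0, 1.0, size=(n_inputs, n_outputs))  # stored capacitor voltages (V)
inputs = rng.uniform(0.0, 0.2, size=n_inputs)                # applied drain voltages (V)

k = 1e-4  # hypothetical transconductance-like scale factor (A/V^2)

# Each unit multiplies its input voltage by its stored weight voltage.
unit_currents = k * weights * inputs[:, None]                # shape (n_inputs, n_outputs)

# Converging (summing) the currents along each output line yields the
# sum-of-products, i.e. one MAC result per output line.
mac_currents = unit_currents.sum(axis=0)

# The analog result equals the matrix-vector product, scaled by k.
assert np.allclose(mac_currents, k * (inputs @ weights))
```

Under this linear-product assumption, reading all output lines at once performs a full matrix-vector multiplication in one step, which is the operation that dominates neural-network inference and training workloads.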
Journal: Nat Commun
Published online: 2021-06-07
License: © The Author(s) 2021. This article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/), which permits use, sharing, adaptation, distribution and reproduction in any medium or format, provided appropriate credit is given to the original author(s) and the source.