
High-precision and linear weight updates by subnanosecond pulses in ferroelectric tunnel junction for neuro-inspired computing


Bibliographic Details
Main Authors: Luo, Zhen, Wang, Zijian, Guan, Zeyu, Ma, Chao, Zhao, Letian, Liu, Chuanchuan, Sun, Haoyang, Wang, He, Lin, Yue, Jin, Xi, Yin, Yuewei, Li, Xiaoguang
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8816951/
https://www.ncbi.nlm.nih.gov/pubmed/35121735
http://dx.doi.org/10.1038/s41467-022-28303-x
Description
Summary: The rapid development of neuro-inspired computing demands synaptic devices with ultrafast speed, low power consumption, and multiple non-volatile states, among other features. Here, a high-performance synaptic device is designed and established based on a Ag/PbZr0.52Ti0.48O3 (PZT, (111)-oriented)/Nb:SrTiO3 ferroelectric tunnel junction (FTJ). The advantages of the (111)-oriented PZT (~1.2 nm) include its multiple ferroelectric switching dynamics, ultrafine ferroelectric domains, and small coercive voltage. The FTJ shows high-precision (256 states, 8 bits), reproducible (cycle-to-cycle variation ~2.06%), linear (nonlinearity <1) and symmetric weight updates, with a good endurance of >10^9 cycles and an ultralow write energy consumption. In particular, manipulations among 150 states are realized under subnanosecond (~630 ps) pulse voltages ≤5 V, and the fastest resistance switching of the FTJs, at 300 ps, is achieved with voltages <13 V. Based on the experimental performance, the convolutional neural network simulation achieves a high online learning accuracy of ~94.7% for recognizing fashion product images, close to the calculated result of ~95.6% by floating-point-based convolutional neural network software. Interestingly, the FTJ-based neural network is very robust to input image noise, showing potential for practical applications. This work represents an important improvement in FTJs towards building neuro-inspired computing systems.
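The abstract reports 256 conductance states, a nonlinearity below 1, and ~2.06% cycle-to-cycle variation in the weight updates. One common way to reason about such device figures in a network simulation is a simple behavioral model of the conductance-versus-pulse curve with write noise. The sketch below is a minimal, hypothetical illustration of that idea in Python; the exponential update form, the parameter names, and the functions are assumptions for illustration, not the simulation code used in the paper.

```python
import numpy as np

# Hypothetical behavioral model of an analog synaptic device with
# N discrete conductance states, a nonlinearity parameter, and
# cycle-to-cycle write variation. Only the headline numbers
# (256 states, ~2.06% variation) come from the abstract.

def conductance_curve(n_states=256, nonlinearity=0.5, g_min=0.0, g_max=1.0):
    """Return conductance vs. pulse number for a potentiation sweep."""
    p = np.arange(n_states)
    if abs(nonlinearity) < 1e-9:
        # Perfectly linear weight update (ideal case)
        return g_min + (g_max - g_min) * p / (n_states - 1)
    # Exponential saturation curve; larger `nonlinearity` bends it more
    a = (n_states - 1) / nonlinearity
    b = (g_max - g_min) / (1.0 - np.exp(-(n_states - 1) / a))
    return g_min + b * (1.0 - np.exp(-p / a))


def write_with_variation(target_g, variation=0.0206, rng=None):
    """Add Gaussian cycle-to-cycle variation (~2.06% in the abstract) to writes."""
    rng = rng or np.random.default_rng(0)
    return target_g * (1.0 + rng.normal(0.0, variation, size=np.shape(target_g)))


if __name__ == "__main__":
    g = conductance_curve(n_states=256, nonlinearity=0.5)
    g_noisy = write_with_variation(g)
    ideal = np.linspace(g[0], g[-1], g.size)
    print(f"states: {g.size}, max deviation from a linear ramp: "
          f"{np.max(np.abs(g - ideal)):.4f}")
```

In simulation frameworks of this kind, the noisy, slightly nonlinear conductance levels produced by such a model replace ideal floating-point weights during training, which is how a gap like the reported ~94.7% (device-based) versus ~95.6% (software) accuracy is typically evaluated.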