
Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices

Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory…


Bibliographic Details
Main Authors: Spoon, Katie, Tsai, Hsinyu, Chen, An, Rasch, Malte J., Ambrogio, Stefano, Mackin, Charles, Fasoli, Andrea, Friz, Alexander M., Narayanan, Pritish, Stanisavljevic, Milos, Burr, Geoffrey W.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8287521/
https://www.ncbi.nlm.nih.gov/pubmed/34290595
http://dx.doi.org/10.3389/fncom.2021.675741