Toward Software-Equivalent Accuracy on Transformer-Based Deep Neural Networks With Analog Memory Devices
Recent advances in deep learning have been driven by ever-increasing model sizes, with networks growing to millions or even billions of parameters. Such enormous models call for fast and energy-efficient hardware accelerators. We study the potential of Analog AI accelerators based on Non-Volatile Memory…
| Field | Value |
|---|---|
| Main authors | |
| Format | Online Article Text |
| Language | English |
| Published | Frontiers Media S.A., 2021 |
| Subjects | |
| Online access | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8287521/ https://www.ncbi.nlm.nih.gov/pubmed/34290595 http://dx.doi.org/10.3389/fncom.2021.675741 |