
Gradient Decomposition Methods for Training Neural Networks With Non-ideal Synaptic Devices

While promising for high-capacity machine learning accelerators, memristor devices have non-idealities that prevent software-equivalent accuracies when used for online training. This work uses a combination of Mini-Batch Gradient Descent (MBGD) to average gradients, stochastic rounding to avoid vani...
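The abstract names stochastic rounding as one of the techniques combined with mini-batch gradient descent. As a rough illustration only (the function name, step size, and RNG handling below are assumptions, not taken from the paper), stochastic rounding quantizes a value to a grid with probability proportional to its distance from the neighboring grid points, so the rounding is unbiased in expectation:

```python
import numpy as np

def stochastic_round(x, step=1.0, rng=None):
    """Round x to a multiple of `step` stochastically.

    A value landing fraction f of the way between two grid points
    rounds up with probability f, so E[rounded] == x. This is a
    generic sketch, not the paper's implementation.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    scaled = np.asarray(x, dtype=float) / step
    floor = np.floor(scaled)
    frac = scaled - floor
    # Round up with probability equal to the fractional part.
    rounded = floor + (rng.random(scaled.shape) < frac)
    return rounded * step

# Exact multiples are left unchanged; in-between values average out.
samples = stochastic_round(np.full(10000, 0.3))
print(samples.mean())  # close to 0.3 in expectation
```

In the context of device training, this kind of unbiased rounding keeps small gradient updates from being silently truncated to zero by a coarse programming resolution.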


Bibliographic Details
Main Authors: Zhao, Junyun, Huang, Siyuan, Yousuf, Osama, Gao, Yutong, Hoskins, Brian D., Adam, Gina C.
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8645649/
https://www.ncbi.nlm.nih.gov/pubmed/34880721
http://dx.doi.org/10.3389/fnins.2021.749811