
DisSAGD: A Distributed Parameter Update Scheme Based on Variance Reduction

Machine learning models trained with SGD often converge slowly and unstably because sample-based gradient estimates have high variance. To speed up convergence and improve stability, a distributed SGD algorithm based on variance reduction, named DisSAGD, is proposed...
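The abstract's premise is that plain SGD uses a noisy per-sample gradient, and variance-reduction methods correct that estimate so its variance shrinks as training progresses. As a point of reference, here is a minimal single-machine sketch of SAGA-style variance reduction on a synthetic least-squares problem; it is not the paper's distributed DisSAGD scheme, and the problem, step size, and iteration count are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic least-squares problem: minimize (1/n) * sum_i (x_i . w - y_i)^2
n, d = 200, 5
X = rng.normal(size=(n, d))
w_true = rng.normal(size=d)
y = X @ w_true  # noiseless targets, so w_true is the exact minimizer

def grad_i(w, i):
    # Gradient of the i-th squared-error term
    return 2.0 * (X[i] @ w - y[i]) * X[i]

# SAGA keeps a table of the most recent gradient seen for each sample
w = np.zeros(d)
table = np.array([grad_i(w, i) for i in range(n)])
avg = table.mean(axis=0)
lr = 0.005  # illustrative step size

for _ in range(50 * n):
    i = rng.integers(n)
    g_new = grad_i(w, i)
    # Variance-reduced estimator: unbiased, and its variance shrinks
    # as the stored gradients approach the gradients at the optimum
    w -= lr * (g_new - table[i] + avg)
    avg += (g_new - table[i]) / n  # update running mean, then the table
    table[i] = g_new

err = float(np.linalg.norm(w - w_true))
print(err)
```

Because the correction term `- table[i] + avg` has zero mean, the update stays unbiased while its noise decays, which is the mechanism that variance-reduced methods like the one underlying DisSAGD exploit to converge faster than plain SGD.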


Bibliographic Details
Main Authors: Pan, Haijie; Zheng, Lirong
Format: Online Article Text
Language: English
Published: MDPI 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8347539/
https://www.ncbi.nlm.nih.gov/pubmed/34372361
http://dx.doi.org/10.3390/s21155124