DisSAGD: A Distributed Parameter Update Scheme Based on Variance Reduction
Machine learning models trained with SGD often converge slowly and unstably because the stochastic gradient estimated from a single sample has high variance. To speed up convergence and improve stability, a distributed SGD algorithm based on variance reduction, named DisSAGD, is proposed...
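This record does not include the paper's implementation details, so as a generic illustration of the variance-reduction idea the abstract refers to, here is a minimal SVRG-style sketch in Python/NumPy. SVRG is a standard variance-reduced SGD method; it is not necessarily the update scheme DisSAGD uses, and the least-squares problem and all names below are hypothetical.

```python
# Minimal SVRG-style variance-reduced SGD sketch (illustrative only; this is
# NOT the paper's DisSAGD algorithm, whose distributed update scheme is not
# described in this record). The least-squares setup below is hypothetical.
import numpy as np

rng = np.random.default_rng(0)
n, d = 200, 10
A = rng.normal(size=(n, d))
x_true = rng.normal(size=d)
b = A @ x_true + 0.01 * rng.normal(size=n)

def grad_i(w, i):
    # Gradient of the i-th squared-error term 0.5 * (a_i^T w - b_i)^2.
    return (A[i] @ w - b[i]) * A[i]

def full_grad(w):
    # Full-batch gradient, averaged over all n samples.
    return A.T @ (A @ w - b) / n

w = np.zeros(d)
eta, epochs, m = 0.01, 30, n  # step size, outer epochs, inner-loop length
for _ in range(epochs):
    w_snap = w.copy()        # snapshot point
    mu = full_grad(w_snap)   # full gradient at the snapshot
    for _ in range(m):
        i = rng.integers(n)
        # Variance-reduced stochastic gradient: same expectation as
        # grad_i(w, i), but its variance shrinks as w approaches w_snap.
        g = grad_i(w, i) - grad_i(w_snap, i) + mu
        w -= eta * g

print("distance to x_true:", np.linalg.norm(w - x_true))
```

The correction term grad_i(w_snap, i) - mu has zero mean, so each update direction is an unbiased gradient estimate whose per-sample noise largely cancels near convergence; this is the stability gain the abstract attributes to variance reduction.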
Main Authors: | Pan, Haijie; Zheng, Lirong |
---|---|
Format: | Online Article (Text) |
Language: | English |
Published: | MDPI, 2021 |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8347539/ https://www.ncbi.nlm.nih.gov/pubmed/34372361 http://dx.doi.org/10.3390/s21155124 |
Similar Items
- Bitumen Recovery Performance of SAGD and Butane- and Hexane-Aided SAGD in the Presence of Shale Barriers
  by: Kumar, Ashish, et al.
  Published: (2022)
- SAGD: a comprehensive sex-associated gene database from transcriptomes
  by: Shi, Meng-Wei, et al.
  Published: (2019)
- Visualization Experimental Study on Well Spacing Optimization of SAGD with a Combination of Vertical and Horizontal Wells
  by: Tao, Lei, et al.
  Published: (2021)
- CRISPR-Cas9 Mediated Knockout of SagD Gene for Overexpression of Streptokinase in Streptococcus equisimilis
  by: Chaudhari, Armi M., et al.
  Published: (2022)
- LCA model validation of SAGD facilities with real operation data as a collaborative example between model developers and industry
  by: Masnadi, Mohammad S., et al.
  Published: (2022)