Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
Adaptive algorithms are widely used for training deep neural networks (DNNs) because of their fast convergence rates. However, the training cost becomes prohibitively expensive due to the computation of the full gradient when training complicated DNNs. To reduce the computational cost, we present a st...
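Only the (truncated) abstract is reproduced here, so the following is a minimal NumPy sketch of the general idea it names, block-coordinate updates combined with an Adam-style adaptive rule, rather than the authors' actual algorithm. The function name block_adam_step, the random contiguous block selection, the shared step counter for bias correction, and the toy quadratic objective are all assumptions made purely for illustration.

```python
# Illustrative sketch only (not the paper's method): at each step a random block
# of parameters is selected and an Adam-style adaptive update is applied to that
# block alone, so the per-step gradient cost scales with the block size rather
# than with the full parameter count.
import numpy as np

def block_adam_step(w, block_grad_fn, state, block_size=100, lr=1e-3,
                    beta1=0.9, beta2=0.999, eps=1e-8):
    """One stochastic block adaptive step on the parameter vector w."""
    n = w.size
    start = np.random.randint(0, max(n - block_size, 0) + 1)
    idx = slice(start, start + block_size)          # randomly chosen block
    g = block_grad_fn(w, idx)                       # gradient w.r.t. the block only
    m, v, t = state["m"], state["v"], state["t"] + 1
    m[idx] = beta1 * m[idx] + (1 - beta1) * g       # first moment (block only)
    v[idx] = beta2 * v[idx] + (1 - beta2) * g**2    # second moment (block only)
    m_hat = m[idx] / (1 - beta1**t)                 # bias correction with a shared
    v_hat = v[idx] / (1 - beta2**t)                 # step counter, for simplicity
    w[idx] -= lr * m_hat / (np.sqrt(v_hat) + eps)   # adaptive update on the block
    state["t"] = t
    return w, state

# Usage on a toy quadratic objective f(w) = ||w||^2:
w = np.random.randn(1000)
state = {"m": np.zeros_like(w), "v": np.zeros_like(w), "t": 0}
block_grad_fn = lambda w, idx: 2 * w[idx]           # block of the gradient of ||w||^2
for _ in range(500):
    w, state = block_adam_step(w, block_grad_fn, state)
```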
Main Authors: Liu, Jianghui; Li, Baozhu; Zhou, Yangfan; Zhao, Xuhui; Zhu, Junlong; Zhang, Mingchuan
Format: Online Article Text
Language: English
Published: Hindawi, 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9184190/
https://www.ncbi.nlm.nih.gov/pubmed/35694581
http://dx.doi.org/10.1155/2022/9337209
Similar Items
- FedADT: An Adaptive Method Based on Derivative Term for Federated Learning
  by: Gao, Huimin, et al.
  Published: (2023)
- Accelerating DNN Training Through Selective Localized Learning
  by: Krithivasan, Sarada, et al.
  Published: (2022)
- DNN-MVL: DNN-Multi-View-Learning-Based Recover Block Missing Data in a Dam Safety Monitoring System
  by: Mao, Yingchi, et al.
  Published: (2019)
- Adam and the Ants: On the Influence of the Optimization Algorithm on the Detectability of DNN Watermarks
  by: Cortiñas-Lorenzo, Betty, et al.
  Published: (2020)
- Intelligent Sports Video Classification Based on Deep Neural Network (DNN) Algorithm and Transfer Learning
  by: Guo, Xiaoping
  Published: (2021)