
Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm

Adaptive algorithms are widely used because of their fast convergence rate for training deep neural networks (DNNs). However, the training cost becomes prohibitively expensive due to the computation of the full gradient when training complicated DNNs. To reduce the computational cost, we present a stochastic block adaptive gradient online training algorithm in this study, called SBAG. In this algorithm, stochastic block coordinate descent and an adaptive learning rate are utilized at each iteration. We also prove that a regret bound of [Formula: see text] can be achieved via SBAG, where T is the time horizon. In addition, we use SBAG to train ResNet-34 and DenseNet-121 on CIFAR-10. The results demonstrate that SBAG has better training speed and generalization ability than other existing training methods.


Bibliographic Details
Main Authors: Liu, Jianghui, Li, Baozhu, Zhou, Yangfan, Zhao, Xuhui, Zhu, Junlong, Zhang, Mingchuan
Format: Online Article Text
Language: English
Published: Hindawi 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9184190/
https://www.ncbi.nlm.nih.gov/pubmed/35694581
http://dx.doi.org/10.1155/2022/9337209
_version_ 1784724456850063360
author Liu, Jianghui
Li, Baozhu
Zhou, Yangfan
Zhao, Xuhui
Zhu, Junlong
Zhang, Mingchuan
author_facet Liu, Jianghui
Li, Baozhu
Zhou, Yangfan
Zhao, Xuhui
Zhu, Junlong
Zhang, Mingchuan
author_sort Liu, Jianghui
collection PubMed
description Adaptive algorithms are widely used because of their fast convergence rate for training deep neural networks (DNNs). However, the training cost becomes prohibitively expensive due to the computation of the full gradient when training complicated DNNs. To reduce the computational cost, we present a stochastic block adaptive gradient online training algorithm in this study, called SBAG. In this algorithm, stochastic block coordinate descent and an adaptive learning rate are utilized at each iteration. We also prove that a regret bound of [Formula: see text] can be achieved via SBAG, where T is the time horizon. In addition, we use SBAG to train ResNet-34 and DenseNet-121 on CIFAR-10. The results demonstrate that SBAG has better training speed and generalization ability than other existing training methods.
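The record does not include the paper's pseudocode, but the description names the two ingredients: stochastic block coordinate descent combined with an adaptive learning rate. The sketch below illustrates that combination under stated assumptions — an Adam-style adaptive rule applied only to one randomly sampled coordinate block per iteration. The function name `sbag_step`, the hyperparameters, and the toy objective are illustrative, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def sbag_step(w, grad_fn, state, lr=0.01, beta1=0.9, beta2=0.999,
              eps=1e-8, n_blocks=4):
    """One hypothetical SBAG-style step: sample a coordinate block
    uniformly at random, then apply an Adam-like adaptive update
    to that block only (the rest of w is untouched this step)."""
    m, v, t = state
    t += 1
    # Partition the coordinates into n_blocks blocks and pick one.
    blocks = np.array_split(np.arange(w.size), n_blocks)
    b = blocks[rng.integers(n_blocks)]
    g = grad_fn(w)[b]                          # stochastic gradient, block b only
    m[b] = beta1 * m[b] + (1 - beta1) * g      # first-moment estimate
    v[b] = beta2 * v[b] + (1 - beta2) * g**2   # second-moment estimate
    m_hat = m[b] / (1 - beta1**t)              # bias-corrected moments
    v_hat = v[b] / (1 - beta2**t)
    w[b] -= lr * m_hat / (np.sqrt(v_hat) + eps)
    return w, (m, v, t)

# Toy usage: minimize ||w||^2, whose gradient is 2w.
w = rng.normal(size=8)
state = (np.zeros(8), np.zeros(8), 0)
for _ in range(2000):
    w, state = sbag_step(w, lambda x: 2 * x, state)
```

Updating only one block per iteration is what reduces the per-step cost relative to a full-gradient adaptive method; on the toy quadratic above, each block is still visited often enough for w to shrink toward zero.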
format Online
Article
Text
id pubmed-9184190
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Hindawi
record_format MEDLINE/PubMed
spelling pubmed-9184190 2022-06-10 Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm Liu, Jianghui Li, Baozhu Zhou, Yangfan Zhao, Xuhui Zhu, Junlong Zhang, Mingchuan Comput Intell Neurosci Research Article Adaptive algorithms are widely used because of their fast convergence rate for training deep neural networks (DNNs). However, the training cost becomes prohibitively expensive due to the computation of the full gradient when training complicated DNNs. To reduce the computational cost, we present a stochastic block adaptive gradient online training algorithm in this study, called SBAG. In this algorithm, stochastic block coordinate descent and an adaptive learning rate are utilized at each iteration. We also prove that a regret bound of [Formula: see text] can be achieved via SBAG, where T is the time horizon. In addition, we use SBAG to train ResNet-34 and DenseNet-121 on CIFAR-10. The results demonstrate that SBAG has better training speed and generalization ability than other existing training methods. Hindawi 2022-06-02 /pmc/articles/PMC9184190/ /pubmed/35694581 http://dx.doi.org/10.1155/2022/9337209 Text en Copyright © 2022 Jianghui Liu et al. https://creativecommons.org/licenses/by/4.0/ This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Research Article
Liu, Jianghui
Li, Baozhu
Zhou, Yangfan
Zhao, Xuhui
Zhu, Junlong
Zhang, Mingchuan
Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title_full Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title_fullStr Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title_full_unstemmed Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title_short Online Learning for DNN Training: A Stochastic Block Adaptive Gradient Algorithm
title_sort online learning for dnn training: a stochastic block adaptive gradient algorithm
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9184190/
https://www.ncbi.nlm.nih.gov/pubmed/35694581
http://dx.doi.org/10.1155/2022/9337209
work_keys_str_mv AT liujianghui onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm
AT libaozhu onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm
AT zhouyangfan onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm
AT zhaoxuhui onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm
AT zhujunlong onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm
AT zhangmingchuan onlinelearningfordnntrainingastochasticblockadaptivegradientalgorithm