FedADT: An Adaptive Method Based on Derivative Term for Federated Learning
Federated learning serves as a novel distributed training framework that enables multiple clients in the Internet of Things to collaboratively train a global model while the data remains local. However, the implementation of federated learning faces many problems in practice, such as the large number...
Main Authors: | Gao, Huimin; Wu, Qingtao; Zhao, Xuhui; Zhu, Junlong; Zhang, Mingchuan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10347066/ https://www.ncbi.nlm.nih.gov/pubmed/37447882 http://dx.doi.org/10.3390/s23136034 |
Field | Value
---|---|
_version_ | 1785073461771632640 |
author | Gao, Huimin Wu, Qingtao Zhao, Xuhui Zhu, Junlong Zhang, Mingchuan |
author_facet | Gao, Huimin Wu, Qingtao Zhao, Xuhui Zhu, Junlong Zhang, Mingchuan |
author_sort | Gao, Huimin |
collection | PubMed |
description | Federated learning serves as a novel distributed training framework that enables multiple clients in the Internet of Things to collaboratively train a global model while the data remains local. However, the implementation of federated learning faces many problems in practice, such as the large number of training rounds required for convergence due to the size of the model and the lack of adaptivity in the stochastic gradient-based updates on the client side. Meanwhile, the training is sensitive to noise during the optimization process, which can affect the performance of the final model. For these reasons, we propose Federated Adaptive learning based on the Derivative Term, called FedADT in this paper, which incorporates an adaptive step size and the difference of gradients in the update of the local model. To further reduce the influence of noise on the derivative term, which is estimated by the difference of gradients, we apply moving average decay to the derivative term. Moreover, we analyze the convergence performance of the proposed algorithm for non-convex objective functions, i.e., a convergence rate of [Formula: see text] can be achieved by choosing appropriate hyper-parameters, where n is the number of clients and T is the number of iterations, respectively. Finally, various experiments on the image classification task are conducted by training a widely used convolutional neural network on the MNIST and Fashion MNIST datasets to verify the effectiveness of FedADT. In addition, the receiver operating characteristic curve is used to display the results of the proposed algorithm by predicting the categories of clothing on the Fashion MNIST dataset. |
format | Online Article Text |
id | pubmed-10347066 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-10347066 2023-07-15 FedADT: An Adaptive Method Based on Derivative Term for Federated Learning Gao, Huimin Wu, Qingtao Zhao, Xuhui Zhu, Junlong Zhang, Mingchuan Sensors (Basel) Article Federated learning serves as a novel distributed training framework that enables multiple clients in the Internet of Things to collaboratively train a global model while the data remains local. However, the implementation of federated learning faces many problems in practice, such as the large number of training rounds required for convergence due to the size of the model and the lack of adaptivity in the stochastic gradient-based updates on the client side. Meanwhile, the training is sensitive to noise during the optimization process, which can affect the performance of the final model. For these reasons, we propose Federated Adaptive learning based on the Derivative Term, called FedADT in this paper, which incorporates an adaptive step size and the difference of gradients in the update of the local model. To further reduce the influence of noise on the derivative term, which is estimated by the difference of gradients, we apply moving average decay to the derivative term. Moreover, we analyze the convergence performance of the proposed algorithm for non-convex objective functions, i.e., a convergence rate of [Formula: see text] can be achieved by choosing appropriate hyper-parameters, where n is the number of clients and T is the number of iterations, respectively. Finally, various experiments on the image classification task are conducted by training a widely used convolutional neural network on the MNIST and Fashion MNIST datasets to verify the effectiveness of FedADT. In addition, the receiver operating characteristic curve is used to display the results of the proposed algorithm by predicting the categories of clothing on the Fashion MNIST dataset. MDPI 2023-06-29 /pmc/articles/PMC10347066/ /pubmed/37447882 http://dx.doi.org/10.3390/s23136034 Text en © 2023 by the authors. https://creativecommons.org/licenses/by/4.0/ Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Gao, Huimin Wu, Qingtao Zhao, Xuhui Zhu, Junlong Zhang, Mingchuan FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title | FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title_full | FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title_fullStr | FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title_full_unstemmed | FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title_short | FedADT: An Adaptive Method Based on Derivative Term for Federated Learning |
title_sort | fedadt: an adaptive method based on derivative term for federated learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10347066/ https://www.ncbi.nlm.nih.gov/pubmed/37447882 http://dx.doi.org/10.3390/s23136034 |
work_keys_str_mv | AT gaohuimin fedadtanadaptivemethodbasedonderivativetermforfederatedlearning AT wuqingtao fedadtanadaptivemethodbasedonderivativetermforfederatedlearning AT zhaoxuhui fedadtanadaptivemethodbasedonderivativetermforfederatedlearning AT zhujunlong fedadtanadaptivemethodbasedonderivativetermforfederatedlearning AT zhangmingchuan fedadtanadaptivemethodbasedonderivativetermforfederatedlearning |
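The abstract in the record above describes the FedADT update only at a high level: each client combines an adaptive step size with a derivative term estimated from the difference of successive gradients and smoothed by a moving-average decay, and the server aggregates the resulting local models. The sketch below is one plausible reading of that description; the exact update rule, the hyper-parameter names (`beta_d`, `beta_v`), the Adam-style second-moment normalization, and the toy least-squares clients are all assumptions for illustration, not the implementation evaluated in the paper.

```python
import numpy as np


def local_update(w, grad_fn, steps=5, lr=0.05, beta_d=0.9, beta_v=0.999, eps=1e-8):
    """One client's local training pass, starting from the global model w.

    Keeps an exponential moving average `d` of gradient differences (the
    "derivative term") and an Adam-style second moment `v` for the adaptive
    step size. Hyper-parameter names and the exact rule are assumptions.
    """
    w = w.copy()
    g_prev = grad_fn(w)
    d = np.zeros_like(w)
    v = np.zeros_like(w)
    for k in range(1, steps + 1):
        g = grad_fn(w)
        # derivative term: smoothed difference of successive gradients
        d = beta_d * d + (1.0 - beta_d) * (g - g_prev)
        # adaptive step size via a bias-corrected second-moment estimate
        v = beta_v * v + (1.0 - beta_v) * g * g
        v_hat = v / (1.0 - beta_v ** k)
        w -= lr * (g + d) / (np.sqrt(v_hat) + eps)
        g_prev = g
    return w


def federated_round(w_global, client_grad_fns):
    """One communication round: every client updates locally, the server averages."""
    local_models = [local_update(w_global, gf) for gf in client_grad_fns]
    return np.mean(local_models, axis=0)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    dim, n_clients = 10, 5
    # toy heterogeneous least-squares clients (illustration only)
    problems = [(rng.normal(size=(20, dim)), rng.normal(size=20)) for _ in range(n_clients)]
    grad_fns = [lambda w, A=A, b=b: A.T @ (A @ w - b) / len(b) for A, b in problems]

    w = np.zeros(dim)
    for _ in range(50):
        w = federated_round(w, grad_fns)
    avg_loss = np.mean([np.mean((A @ w - b) ** 2) for A, b in problems])
    print(f"average client loss after 50 rounds: {avg_loss:.4f}")
```

In the paper itself the method is evaluated by training convolutional neural networks on MNIST and Fashion MNIST; the quadratic objectives here only keep the example self-contained and runnable.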