A Cluster-Driven Adaptive Training Approach for Federated Learning
Main Authors: Jeong, Younghwan; Kim, Taeyoon
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9502390/ https://www.ncbi.nlm.nih.gov/pubmed/36146408 http://dx.doi.org/10.3390/s22187061
_version_ | 1784795693299269632 |
author | Jeong, Younghwan; Kim, Taeyoon |
author_facet | Jeong, Younghwan; Kim, Taeyoon |
author_sort | Jeong, Younghwan |
collection | PubMed |
description | Federated learning (FL) is a promising collaborative learning approach in edge computing that reduces communication costs and addresses the data privacy concerns of traditional cloud-based training. Accordingly, diverse studies have been conducted to deploy FL in industry. However, practical issues (e.g., handling non-IID data and stragglers) must still be solved before FL can be implemented in real systems. To address these issues, this paper proposes a cluster-driven adaptive training approach (CATA-Fed) to enhance the performance of FL training in a practical environment. CATA-Fed employs adaptive training during the local model updates to improve training efficiency, reducing the time and resources wasted because of stragglers, and also provides a straggler-mitigating scheme that reduces the workload of straggling clients. In addition, CATA-Fed clusters the clients by data size and selects training participants within a cluster, reducing the magnitude differences among the local gradients collected for the global model update under statistically heterogeneous conditions (e.g., non-IID data). During this client selection, proportional fair scheduling is employed to secure data diversity and balance the client load. Extensive experiments on three benchmark datasets (MNIST, Fashion-MNIST, and CIFAR-10) show that CATA-Fed outperforms previous FL schemes (FedAVG, FedProx, and TiFL) in training speed and test accuracy under diverse FL conditions. |
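The abstract only sketches CATA-Fed at a high level; the minimal Python below illustrates the general idea of the three ingredients it names (clustering clients by data size, proportional-fair client selection within a cluster, and data-size-weighted FedAvg-style aggregation). All names (`Client`, `cluster_by_data_size`, `select_proportional_fair`, `fedavg`) and the specific scoring/averaging details are illustrative assumptions, not the authors' actual implementation.

```python
class Client:
    """Hypothetical FL client: only the metadata needed for selection."""
    def __init__(self, cid, data_size):
        self.cid = cid
        self.data_size = data_size
        self.avg_rate = 1e-9  # historical participation (PF denominator)

def cluster_by_data_size(clients, num_clusters):
    """Group clients into clusters of similar local dataset size, so the
    gradients aggregated in one round have comparable magnitudes."""
    ordered = sorted(clients, key=lambda c: c.data_size)
    k, n = num_clusters, len(ordered)
    return [ordered[i * n // k:(i + 1) * n // k] for i in range(k)]

def select_proportional_fair(cluster, num_selected):
    """Pick clients with the highest instantaneous-to-average ratio
    (classic proportional-fair scheduling), balancing load and diversity."""
    scored = sorted(cluster, key=lambda c: c.data_size / c.avg_rate,
                    reverse=True)
    chosen = scored[:num_selected]
    for c in cluster:
        # exponential moving average of how much each client was served
        served = c.data_size if c in chosen else 0.0
        c.avg_rate = 0.9 * c.avg_rate + 0.1 * served
    return chosen

def fedavg(local_weights, data_sizes):
    """Data-size-weighted average of local model parameters (FedAvg)."""
    total = sum(data_sizes)
    dim = len(local_weights[0])
    return [sum(w[j] * s for w, s in zip(local_weights, data_sizes)) / total
            for j in range(dim)]
```

For example, `fedavg([[1.0, 2.0], [3.0, 4.0]], [10, 30])` weights the second client's parameters three times as heavily, yielding `[2.5, 3.5]`; selecting within one size-homogeneous cluster keeps those weights, and hence the aggregated gradient magnitudes, comparable.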
format | Online Article Text |
id | pubmed-9502390 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2022 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-9502390 2022-09-24 A Cluster-Driven Adaptive Training Approach for Federated Learning Jeong, Younghwan; Kim, Taeyoon. Sensors (Basel), Article. MDPI 2022-09-18 /pmc/articles/PMC9502390/ /pubmed/36146408 http://dx.doi.org/10.3390/s22187061 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Jeong, Younghwan Kim, Taeyoon A Cluster-Driven Adaptive Training Approach for Federated Learning |
title | A Cluster-Driven Adaptive Training Approach for Federated Learning |
title_full | A Cluster-Driven Adaptive Training Approach for Federated Learning |
title_fullStr | A Cluster-Driven Adaptive Training Approach for Federated Learning |
title_full_unstemmed | A Cluster-Driven Adaptive Training Approach for Federated Learning |
title_short | A Cluster-Driven Adaptive Training Approach for Federated Learning |
title_sort | cluster-driven adaptive training approach for federated learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9502390/ https://www.ncbi.nlm.nih.gov/pubmed/36146408 http://dx.doi.org/10.3390/s22187061 |
work_keys_str_mv | AT jeongyounghwan aclusterdrivenadaptivetrainingapproachforfederatedlearning AT kimtaeyoon aclusterdrivenadaptivetrainingapproachforfederatedlearning AT jeongyounghwan clusterdrivenadaptivetrainingapproachforfederatedlearning AT kimtaeyoon clusterdrivenadaptivetrainingapproachforfederatedlearning |