
Robust Aggregation for Federated Learning by Minimum γ-Divergence Estimation


Bibliographic Details
Main Authors: Li, Cen-Jhih, Huang, Pin-Han, Ma, Yi-Ting, Hung, Hung, Huang, Su-Yun
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9141408/
https://www.ncbi.nlm.nih.gov/pubmed/35626569
http://dx.doi.org/10.3390/e24050686
author Li, Cen-Jhih
Huang, Pin-Han
Ma, Yi-Ting
Hung, Hung
Huang, Su-Yun
collection PubMed
description Federated learning is a framework in which multiple devices or institutions, called local clients, collaboratively train a global model without sharing their data. In federated learning with a central server, an aggregation algorithm integrates the model information sent from the local clients to update the parameters of the global model. The sample mean is the simplest and most commonly used aggregation method. However, it is not robust to data with outliers or to the Byzantine problem, in which Byzantine clients send malicious messages to interfere with the learning process. Several robust aggregation methods have been introduced in the literature, including the marginal median, the geometric median, and the trimmed mean. In this article, we propose an alternative robust aggregation method, named the γ-mean, which is obtained by minimum divergence estimation based on a robust density power divergence. This γ-mean aggregation mitigates the influence of Byzantine clients by assigning them smaller weights. The weighting scheme is data-driven and controlled by the γ value. Robustness from the viewpoint of the influence function is discussed, and some numerical results are presented.
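The description above characterizes the γ-mean as a weighted aggregate whose data-driven weights, controlled by γ, shrink the influence of Byzantine clients. As a rough, hypothetical illustration of that weighting idea, and not the authors' implementation, the Python sketch below computes an iteratively reweighted mean under an assumed Gaussian working model with a fixed scale; the weight form exp(-γ‖x_i − μ‖²/(2σ²)), the function name gamma_mean, and all default values are assumptions introduced here for illustration only.

# Hypothetical sketch of a gamma-mean-style aggregation step (not taken from
# the paper). Client updates are combined with data-driven weights that decay
# for updates far from the current center; the Gaussian working model with a
# fixed scale sigma is an assumption made for this illustration.
import numpy as np

def gamma_mean(updates, gamma=0.5, sigma=1.0, n_iter=20):
    """Aggregate client updates (n_clients x dim) by an iteratively
    reweighted mean; gamma = 0 reduces to the plain sample mean."""
    x = np.asarray(updates, dtype=float)
    mu = x.mean(axis=0)                      # initialize at the sample mean
    for _ in range(n_iter):
        d2 = ((x - mu) ** 2).sum(axis=1)     # squared distance to the center
        w = np.exp(-gamma * d2 / (2.0 * sigma ** 2))
        w /= w.sum()                         # normalize the weights
        mu = w @ x                           # weighted-mean update
    return mu

# Example: one Byzantine client sends a large malicious update; it receives a
# near-zero weight, so the aggregate stays close to the honest clients.
honest = np.random.normal(0.0, 0.1, size=(9, 5))
byzantine = np.full((1, 5), 50.0)
print(gamma_mean(np.vstack([honest, byzantine]), gamma=0.5))

With gamma = 0 the weights are uniform and the update is exactly the sample mean mentioned in the description; larger gamma values discount distant, potentially malicious updates more aggressively.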
format Online
Article
Text
id pubmed-9141408
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9141408 2022-05-28 Entropy (Basel) Article MDPI 2022-05-13 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Robust Aggregation for Federated Learning by Minimum γ-Divergence Estimation
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9141408/
https://www.ncbi.nlm.nih.gov/pubmed/35626569
http://dx.doi.org/10.3390/e24050686