
New Generation Federated Learning

Bibliographic Details
Main Authors: Li, Boyuan, Chen, Shengbo, Peng, Zihao
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9654996/
https://www.ncbi.nlm.nih.gov/pubmed/36366172
http://dx.doi.org/10.3390/s22218475
_version_ 1784829076275462144
author Li, Boyuan
Chen, Shengbo
Peng, Zihao
author_facet Li, Boyuan
Chen, Shengbo
Peng, Zihao
author_sort Li, Boyuan
collection PubMed
description With the development of the Internet of Things (IoT), federated learning (FL) has received increasing attention as a distributed machine learning (ML) framework that does not require data exchange. However, current FL frameworks assume an idealized setup in which the task size is fixed and storage space is unlimited, which is unrealistic. In practice, new classes continually emerge over time at the participating clients, and some samples are overwritten or discarded due to storage limitations. A new framework is urgently needed to adapt to the dynamic task sequences and strict storage constraints of the real world. Continual (incremental) learning is an ultimate goal of deep learning, and we introduce incremental learning into FL to describe a new federated learning framework. New generation federated learning (NGFL) is arguably the most desirable framework for FL: in addition to the basic server-coordinated training task, each client must learn its private tasks, which arrive continuously and independently of communication with the server. We give a rigorous mathematical representation of this framework, detail the major challenges it poses, address the main challenges of combining incremental learning with federated learning (aggregation of heterogeneous output layers and the task-transformation mutual-knowledge problem), and show the lower and upper baselines of the framework.
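The "no data exchange" property described in the abstract rests on server-side aggregation of client model parameters rather than raw samples. A minimal FedAvg-style sketch illustrates the idea; all names here are illustrative and are not taken from the NGFL paper, which addresses the harder case of heterogeneous output layers:

```python
# Minimal FedAvg-style aggregation sketch (illustrative, not the NGFL
# algorithm): the server averages per-layer client parameters weighted
# by each client's local sample count; raw data never leaves the clients.

def fed_avg(client_weights, client_sizes):
    """Return the sample-count-weighted average of per-layer parameters."""
    total = sum(client_sizes)
    aggregated = {}
    for layer in client_weights[0]:
        aggregated[layer] = sum(
            w[layer] * (n / total)
            for w, n in zip(client_weights, client_sizes)
        )
    return aggregated

# Two clients with a single scalar "layer" each; the client with more
# local data pulls the global model toward its parameters.
clients = [{"w": 1.0}, {"w": 3.0}]
sizes = [10, 30]
print(fed_avg(clients, sizes))  # weighted mean: {'w': 2.5}
```

In the NGFL setting this aggregation step is complicated by the fact that clients' output layers grow at different rates as new private classes arrive, which is one of the two main challenges the paper addresses.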
format Online
Article
Text
id pubmed-9654996
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9654996 2022-11-15 New Generation Federated Learning Li, Boyuan Chen, Shengbo Peng, Zihao Sensors (Basel) Article MDPI 2022-11-03 /pmc/articles/PMC9654996/ /pubmed/36366172 http://dx.doi.org/10.3390/s22218475 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Li, Boyuan
Chen, Shengbo
Peng, Zihao
New Generation Federated Learning
title New Generation Federated Learning
title_full New Generation Federated Learning
title_fullStr New Generation Federated Learning
title_full_unstemmed New Generation Federated Learning
title_short New Generation Federated Learning
title_sort new generation federated learning
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9654996/
https://www.ncbi.nlm.nih.gov/pubmed/36366172
http://dx.doi.org/10.3390/s22218475
work_keys_str_mv AT liboyuan newgenerationfederatedlearning
AT chenshengbo newgenerationfederatedlearning
AT pengzihao newgenerationfederatedlearning