
Mobility-Aware Federated Learning Considering Multiple Networks


Bibliographic Details
Main Authors: Macedo, Daniel, Santos, Danilo, Perkusich, Angelo, Valadares, Dalton C. G.
Format: Online Article Text
Language: English
Published: MDPI 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10386473/
https://www.ncbi.nlm.nih.gov/pubmed/37514581
http://dx.doi.org/10.3390/s23146286
Description
Summary: Federated learning (FL) is a distributed training method for machine learning (ML) models that keeps data ownership with the users. However, this distributed training approach can lead to variations in efficiency due to user behaviors or characteristics. For instance, mobility can hinder training by causing a client dropout when a device loses connection with other devices on the network. To address this issue, we propose an FL coordination algorithm, MoFeL, to ensure efficient training even in scenarios with mobility. Furthermore, MoFeL evaluates multiple networks with different central servers. To evaluate its effectiveness, we conducted simulation experiments using an image classification application with models trained by a convolutional neural network. The simulation results demonstrate that, in scenarios with high mobility, MoFeL completes [Formula: see text] more training cycles than a traditional FL coordination algorithm that does not consider mobility aspects.
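The abstract describes coordinating FL rounds around client mobility so that devices likely to lose connectivity mid-round are not selected and do not cause dropouts. The sketch below illustrates that general idea only; MoFeL's actual algorithm is not detailed in the abstract, and the function and field names (`predicted_connectivity`, `round_duration`) are illustrative assumptions, not the paper's interface.

```python
def select_clients(clients, round_duration, num_needed):
    """Mobility-aware client selection for an FL training round.

    Each client is a dict with an 'id' and a 'predicted_connectivity'
    value: the number of seconds the device is expected to remain
    reachable (e.g., estimated from its trajectory or signal history).
    Only clients expected to stay connected for the whole round are
    eligible, which avoids mid-round dropouts; among those, the most
    stable clients are preferred.
    """
    eligible = [c for c in clients if c["predicted_connectivity"] >= round_duration]
    eligible.sort(key=lambda c: c["predicted_connectivity"], reverse=True)
    return [c["id"] for c in eligible[:num_needed]]


# Example: three candidate devices with different connectivity forecasts.
devices = [
    {"id": "a", "predicted_connectivity": 30},   # likely to drop out
    {"id": "b", "predicted_connectivity": 120},
    {"id": "c", "predicted_connectivity": 90},
]
print(select_clients(devices, round_duration=60, num_needed=2))  # ['b', 'c']
```

A server following this policy spends fewer rounds recovering from lost updates, which is consistent with the abstract's claim that mobility-aware coordination yields more completed training cycles than a mobility-agnostic baseline.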