Communication-efficient federated learning via knowledge distillation
Federated learning is a privacy-preserving machine learning technique for training intelligent models on decentralized data: instead of sharing raw private data, clients communicate local model updates in each iteration of training. However, model updates can be extremely...
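To make the communication pattern in the abstract concrete, below is a minimal FedAvg-style sketch in Python. This is an illustrative baseline of the "communicate model updates, not raw data" idea, not the paper's method (which uses knowledge distillation to reduce communication cost); the function names, toy linear model, and hyperparameters are all assumptions for the sketch.

```python
import numpy as np

def local_update(weights, data, lr=0.1):
    # Illustrative local step: one gradient update on a client's private
    # (X, y) data, using a squared-error loss on a linear model.
    X, y = data
    grad = X.T @ (X @ weights - y) / len(y)
    return weights - lr * grad

def federated_round(global_weights, client_datasets):
    # One communication round: each client sends back updated weights
    # (never its raw data); the server averages them (FedAvg).
    client_weights = [local_update(global_weights.copy(), d)
                      for d in client_datasets]
    return np.mean(client_weights, axis=0)

# Toy run: 3 clients, each holding a private (X, y) dataset.
rng = np.random.default_rng(0)
clients = [(rng.normal(size=(20, 5)), rng.normal(size=20)) for _ in range(3)]
w = np.zeros(5)
for _ in range(10):
    w = federated_round(w, clients)
```

Note that each round transmits a full weight vector per client; the paper's motivation is precisely that such updates can be large, which distillation-based approaches aim to mitigate.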
Main Authors: Wu, Chuhan; Wu, Fangzhao; Lyu, Lingjuan; Huang, Yongfeng; Xie, Xing
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9018897/ https://www.ncbi.nlm.nih.gov/pubmed/35440643 http://dx.doi.org/10.1038/s41467-022-29763-x
Similar Items
- Differentially private knowledge transfer for federated learning
  by: Qi, Tao, et al.
  Published: (2023)
- A federated graph neural network framework for privacy-preserving personalization
  by: Wu, Chuhan, et al.
  Published: (2022)
- Federated Learning via Augmented Knowledge Distillation for Heterogenous Deep Human Activity Recognition Systems
  by: Gad, Gad, et al.
  Published: (2022)
- A federated learning framework based on transfer learning and knowledge distillation for targeted advertising
  by: Su, Caiyu, et al.
  Published: (2023)
- FedRAD: Heterogeneous Federated Learning via Relational Adaptive Distillation
  by: Tang, Jianwu, et al.
  Published: (2023)