Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning †
We consider information-theoretic bounds on the expected generalization error for statistical learning problems in a network setting. In this setting, there are K nodes, each with its own independent dataset, and the models from the K nodes have to be aggregated into a final centralized model. We co...
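The setting described above, K nodes each holding an independent dataset, with the K locally trained models aggregated into a single centralized model, can be sketched with a minimal one-shot parameter-averaging example. The least-squares learner and the averaging aggregator here are illustrative assumptions for concreteness, not the paper's specific algorithm or bounds.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_model(dataset):
    # Illustrative stand-in for an arbitrary learning algorithm:
    # each node fits a least-squares estimate on its own data only.
    X, y = dataset
    w, *_ = np.linalg.lstsq(X, y, rcond=None)
    return w

K, n, d = 5, 20, 3      # K nodes, n samples per node, d features
true_w = np.ones(d)     # ground-truth parameter (for the synthetic data)

# Each node draws its own independent dataset.
datasets = []
for _ in range(K):
    X = rng.normal(size=(n, d))
    y = X @ true_w + 0.1 * rng.normal(size=n)
    datasets.append((X, y))

# Train K local models, then aggregate by averaging the parameters
# into one centralized model.
local_ws = [local_model(ds) for ds in datasets]
final_w = np.mean(local_ws, axis=0)
```

Averaging is the simplest aggregation rule; the paper's bounds apply to the generalization error of the aggregated model as a function of K and the per-node sample size.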
Main authors: Barnes, Leighton Pate; Dytso, Alex; Poor, Harold Vincent
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9498125/ https://www.ncbi.nlm.nih.gov/pubmed/36141064 http://dx.doi.org/10.3390/e24091178
Similar items
- Finite-Sample Bounds on the Accuracy of Plug-In Estimators of Fisher Information
  by: Cao, Wei, et al.
  Published: (2021)
- Amplitude Constrained MIMO Channels: Properties of Optimal Input Distributions and Bounds on the Capacity †
  by: Dytso, Alex, et al.
  Published: (2019)
- Information-Theoretic Generalization Bounds for Meta-Learning and Applications
  by: Jose, Sharu Theresa, et al.
  Published: (2021)
- A Generalized Information-Theoretic Approach for Bounding the Number of Independent Sets in Bipartite Graphs
  by: Sason, Igal
  Published: (2021)
- Information theoretic bounds on the effectiveness of neural prosthetics
  by: Goodman, Ilan N, et al.
  Published: (2007)