Information-Theoretic Generalization Bounds for Meta-Learning and Applications
Meta-learning, or “learning to learn”, refers to techniques that infer an inductive bias from data corresponding to multiple related tasks, with the goal of improving sample efficiency on new, previously unobserved tasks. A key performance measure for meta-learning is the meta-generalization gap […]
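The meta-generalization gap mentioned in the abstract is commonly formalized as the difference between the average loss on new tasks drawn from the task environment and the empirical loss on the meta-training data. A sketch in generic notation (the symbols below are illustrative and not taken from the record itself):

```latex
% Illustrative formalization of the meta-generalization gap.
% U       : inductive bias (hyperparameters) output by the meta-learner
% D_{1:N} : meta-training data from N observed tasks
\Delta\mathcal{L}(U \mid D_{1:N})
  \;=\;
  \underbrace{\mathcal{L}(U)}_{\substack{\text{population loss,}\\ \text{averaged over new tasks}}}
  \;-\;
  \underbrace{\mathcal{L}_{t}(U \mid D_{1:N})}_{\substack{\text{empirical}\\ \text{meta-training loss}}}
```

Information-theoretic bounds of the kind studied in the paper control the expectation of this gap in terms of mutual-information quantities between the meta-learner's output and the meta-training data.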
Main Authors: Jose, Sharu Theresa; Simeone, Osvaldo
Format: Online Article Text
Language: English
Published: MDPI, 2021
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7835863/
https://www.ncbi.nlm.nih.gov/pubmed/33478002
http://dx.doi.org/10.3390/e23010126
Similar Items
- An Information-Theoretic Analysis of the Cost of Decentralization for Learning and Inference under Privacy Constraints
  by: Jose, Sharu Theresa, et al.
  Published: (2022)
- Improved Information-Theoretic Generalization Bounds for Distributed, Federated, and Iterative Learning †
  by: Barnes, Leighton Pate, et al.
  Published: (2022)
- Information Theoretic Measures and Their Applications
  by: Rosso, Osvaldo A., et al.
  Published: (2020)
- A Generalized Information-Theoretic Approach for Bounding the Number of Independent Sets in Bipartite Graphs
  by: Sason, Igal
  Published: (2021)
- Information theoretic bounds on the effectiveness of neural prosthetics
  by: Goodman, Ilan N, et al.
  Published: (2007)