On cheap entropy-sparsified regression learning
Regression learning is one of the long-standing problems in statistics, machine learning, and deep learning (DL). We show that writing this problem as a probabilistic expectation over (unknown) feature probabilities – thus increasing the number of unknown parameters and seemingly making the problem...
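The abstract's core idea, reframing feature relevance as an (unknown) probability distribution and letting an entropic term drive most of that distribution to near-zero, can be illustrated with a minimal sketch. This is not the paper's actual algorithm: the per-feature loss `l`, the regularization strength `eps`, and the closed-form softmax update are assumptions, shown only to convey how entropic regularization over a probability simplex yields sparsification.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: only the first two of ten features carry signal.
n, d = 200, 10
X = rng.normal(size=(n, d))
y = 3.0 * X[:, 0] - 2.0 * X[:, 1] + 0.1 * rng.normal(size=n)

# Per-feature squared-error loss l_j from a univariate least-squares fit.
beta_uni = (X * y[:, None]).sum(axis=0) / (X ** 2).sum(axis=0)
l = ((y[:, None] - X * beta_uni) ** 2).mean(axis=0)

# Entropic sparsification: minimizing  sum_j w_j * l_j + eps * sum_j w_j log w_j
# over the probability simplex has a closed-form softmax solution, so the
# feature probabilities w_j concentrate on the low-loss features.
eps = 0.5
w = np.exp(-(l - l.min()) / eps)
w /= w.sum()

print(np.round(w, 3))  # weights concentrate on the informative features
```

Because the objective is linear in `w` plus an entropy term, the minimizer is an explicit softmax of the per-feature losses; no iterative solver is needed for this sub-step, which is what makes such sparsification "cheap".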
Main Authors: Horenko, Illia; Vecchi, Edoardo; Kardoš, Juraj; Wächter, Andreas; Schenk, Olaf; O’Kane, Terence J.; Gagliardini, Patrick; Gerber, Susanne
Format: Online Article Text
Language: English
Published: National Academy of Sciences, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9910478/
https://www.ncbi.nlm.nih.gov/pubmed/36580592
http://dx.doi.org/10.1073/pnas.2214972120
Similar Items

- Cheap robust learning of data anomalies with analytically solvable entropic outlier sparsification
  by: Horenko, Illia
  Published: (2022)
- Improving clustering by imposing network information
  by: Gerber, Susanne, et al.
  Published: (2015)
- Low-Cost Probabilistic 3D Denoising with Applications for Ultra-Low-Radiation Computed Tomography
  by: Horenko, Illia, et al.
  Published: (2022)
- Sparsifying priors for Bayesian uncertainty quantification in model discovery
  by: Hirsh, Seth M., et al.
  Published: (2022)
- A scalable approach to the computation of invariant measures for high-dimensional Markovian systems
  by: Gerber, Susanne, et al.
  Published: (2018)