Shaping the learning landscape in neural networks around wide flat minima
Learning in deep neural networks takes place by minimizing a nonconvex, high-dimensional loss function, typically via a stochastic gradient descent (SGD) strategy. The learning process is observed to find good minimizers without getting stuck in local critical points, and such minimizers are...
Main authors: Baldassi, Carlo; Pittorino, Fabrizio; Zecchina, Riccardo
Format: Online Article Text
Language: English
Published: National Academy of Sciences, 2020
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6955380/ https://www.ncbi.nlm.nih.gov/pubmed/31871189 http://dx.doi.org/10.1073/pnas.1908636117
Similar items

- A mean field view of the landscape of two-layer neural networks
  by: Mei, Song, et al.
  Published: (2018)
- Resonance with subthreshold oscillatory drive organizes activity and optimizes learning in neural networks
  by: Roach, James P., et al.
  Published: (2018)
- Fundamental bounds on learning performance in neural circuits
  by: Raman, Dhruva Venkita, et al.
  Published: (2019)
- The immunopeptidomic landscape of ovarian carcinomas
  by: Schuster, Heiko, et al.
  Published: (2017)
- Complex modifier landscape underlying genetic background effects
  by: Hou, Jing, et al.
  Published: (2019)