Exploring deep neural networks via layer-peeled model: Minority collapse in imbalanced training
In this paper, we introduce the Layer-Peeled Model, a nonconvex, yet analytically tractable, optimization program, in a quest to better understand deep neural networks that are trained for a sufficiently long time. As the name suggests, this model is derived by isolating the topmost layer from the r...
Main Authors: Fang, Cong; He, Hangfeng; Long, Qi; Su, Weijie J.
Format: Online Article Text
Language: English
Published: National Academy of Sciences, 2021
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8639364/
https://www.ncbi.nlm.nih.gov/pubmed/34675075
http://dx.doi.org/10.1073/pnas.2103091118
Similar Items
- A law of data separation in deep learning
  by: He, Hangfeng, et al.
  Published: (2023)
- Prevalence of neural collapse during the terminal phase of deep learning training
  by: Papyan, Vardan, et al.
  Published: (2020)
- Automatic diagnosis of imbalanced ophthalmic images using a cost-sensitive deep convolutional neural network
  by: Jiang, Jiewei, et al.
  Published: (2017)
- Dynamics in Deep Classifiers Trained with the Square Loss: Normalization, Low Rank, Neural Collapse, and Generalization Bounds
  by: Xu, Mengjia, et al.
  Published: (2023)
- A novel perceptual two layer image fusion using deep learning for imbalanced COVID-19 dataset
  by: Elzeki, Omar M., et al.
  Published: (2021)