Separation of scales and a thermodynamic description of feature learning in some CNNs
Deep neural networks (DNNs) are powerful tools for compressing and distilling information. Their scale and complexity, often involving billions of inter-dependent parameters, render direct microscopic analysis difficult. Under such circumstances, a common strategy is to identify slow variables that...
Main Authors: Seroussi, Inbar; Naveh, Gadi; Ringel, Zohar
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9938275/ https://www.ncbi.nlm.nih.gov/pubmed/36804926 http://dx.doi.org/10.1038/s41467-023-36361-y
Similar Items
- Emergence of Lie Symmetries in Functional Architectures Learned by CNNs
  by: Bertoni, Federico, et al.
  Published: (2021)
- Malicious Code Variant Identification Based on Multiscale Feature Fusion CNNs
  by: Wang, Shuo, et al.
  Published: (2021)
- Biomedical literature classification with a CNNs-based hybrid learning network
  by: Yan, Yan, et al.
  Published: (2018)
- Learn computer vision using OpenCV: with deep learning CNNs and RNNs
  by: Gollapudi, Sunila
  Published: (2019)
- Matching-range-constrained real-time loop closure detection with CNNs features
  by: Bai, Dongdong, et al.
  Published: (2016)