An Information Theoretic Interpretation to Deep Neural Networks †
With the unprecedented performance achieved by deep learning, it is commonly believed that deep neural networks (DNNs) attempt to extract informative features for learning tasks. To formalize this intuition, we apply the local information geometric analysis and establish an information-theoretic fra...
Main Authors: Xu, Xiangxiang; Huang, Shao-Lun; Zheng, Lizhong; Wornell, Gregory W.
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online Access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8774347/
- https://www.ncbi.nlm.nih.gov/pubmed/35052161
- http://dx.doi.org/10.3390/e24010135
Similar Items
- Information Theoretic-Based Interpretation of a Deep Neural Network Approach in Diagnosing Psychogenic Non-Epileptic Seizures
  by: Gasparini, Sara, et al.
  Published: (2018)
- Interpreting Deep Neural Networks and their Predictions
  by: Samek, Wojciech
  Published: (2018)
- Reliable interpretability of biology-inspired deep neural networks
  by: Esser-Skala, Wolfgang, et al.
  Published: (2023)
- Deep neural network for remote-sensing image interpretation: status and perspectives
  by: Li, Jiayi, et al.
  Published: (2019)
- Automated interpretation of the coronary angioscopy with deep convolutional neural networks
  by: Miyoshi, Toru, et al.
  Published: (2020)