A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction
In this paper, we propose to quantitatively compare loss functions based on parameterized Tsallis–Havrda–Charvat entropy and classical Shannon entropy for training a deep network on the small datasets usually encountered in medical applications. Shannon cross-entropy is wide...
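As a quick illustration of the comparison described in the abstract, the sketch below implements one common parameterization of the Tsallis–Havrda–Charvat cross-entropy next to the standard Shannon cross-entropy. This is a minimal sketch under assumptions: the exact loss form used by the authors may differ, and the function names, the entropic index values, and the example probabilities are illustrative only.

```python
import numpy as np

def shannon_cross_entropy(y_true, y_pred, eps=1e-12):
    """Standard Shannon cross-entropy between one-hot labels and predicted probabilities."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return -np.sum(y_true * np.log(y_pred))

def thc_cross_entropy(y_true, y_pred, alpha=1.5, eps=1e-12):
    """Tsallis-Havrda-Charvat style cross-entropy with entropic index alpha.
    Recovers Shannon cross-entropy in the limit alpha -> 1.
    Illustrative form only; the paper's exact loss may differ."""
    y_pred = np.clip(y_pred, eps, 1.0)
    return np.sum(y_true * (1.0 - y_pred ** (alpha - 1.0))) / (alpha - 1.0)

# Sanity check: for alpha close to 1, the two losses nearly coincide.
y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.2, 0.7, 0.1])
print(shannon_cross_entropy(y_true, y_pred))           # ~0.357
print(thc_cross_entropy(y_true, y_pred, alpha=1.001))  # ~0.357
print(thc_cross_entropy(y_true, y_pred, alpha=2.0))    # 1 - 0.7 = 0.3
```

The entropic index alpha acts as a tunable hyperparameter: values away from 1 reweight how confidently wrong predictions are penalized, which is the degree of freedom the paper exploits when training on small datasets.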
Main Authors: Brochet, Thibaud; Lapuyade-Lahorgue, Jérôme; Huat, Alexandre; Thureau, Sébastien; Pasquier, David; Gardin, Isabelle; Modzelewski, Romain; Gibon, David; Thariat, Juliette; Grégoire, Vincent; Vera, Pierre; Ruan, Su
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9031340/
https://www.ncbi.nlm.nih.gov/pubmed/35455101
http://dx.doi.org/10.3390/e24040436
Similar Items
- Correction: Brochet et al. A Quantitative Comparison between Shannon and Tsallis–Havrda–Charvat Entropies Applied to Cancer Outcome Prediction. Entropy 2022, 24, 436
  by: Brochet, Thibaud, et al.
  Published: (2022)
- Evaluation of Surrogate Endpoints Using Information-Theoretic Measure of Association Based on Havrda and Charvat Entropy
  by: del Carmen Pardo, María, et al.
  Published: (2022)
- Software Code Smell Prediction Model Using Shannon, Rényi and Tsallis Entropies
  by: Gupta, Aakanshi, et al.
  Published: (2018)
- On Conditional Tsallis Entropy
  by: Teixeira, Andreia, et al.
  Published: (2021)
- First Digits' Shannon Entropy
  by: Kreiner, Welf Alfred
  Published: (2022)