Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
The nature of dependence between random variables has been the subject of statistical inquiry for over a century. Even today, there is a great deal of research on this topic, especially focusing on the analysis of nonlinearity. Shannon mutual information has been considered to be the mos…
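The abstract refers to Rényi and Tsallis mutual information as generalizations of the Shannon measure. As background (standard definitions, not taken from this article's text), the underlying generalized entropies for a discrete distribution \(p_1,\dots,p_n\), and one common Rényi-type mutual-information analogue (the Rényi divergence between the joint distribution and the product of marginals; other variants exist in the literature), are:

```latex
% Rényi entropy of order \alpha (\alpha > 0, \alpha \neq 1)
H_{\alpha}(X) = \frac{1}{1-\alpha}\,\log \sum_{i} p_i^{\alpha}

% Tsallis entropy of order q (q \neq 1)
S_{q}(X) = \frac{1}{q-1}\left(1 - \sum_{i} p_i^{q}\right)

% One Rényi-type mutual information: the Rényi divergence of order \alpha
% between the joint p(x,y) and the product of marginals p(x)p(y)
I_{\alpha}(X;Y) = \frac{1}{\alpha-1}\,
  \log \sum_{i,j} p(x_i, y_j)^{\alpha}\,
  \bigl(p(x_i)\,p(y_j)\bigr)^{1-\alpha}
```

Both entropies recover the Shannon entropy in the limit \(\alpha \to 1\) (respectively \(q \to 1\)), which is why tests built on them can be tuned toward or away from the Shannon case.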
Main Authors: Tuna, Elif; Evren, Atıf; Ustaoğlu, Erhan; Şahin, Büşra; Şahinbaşoğlu, Zehra Zeynep
Format: Online Article Text
Language: English
Published: MDPI, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9857815/ | https://www.ncbi.nlm.nih.gov/pubmed/36673220 | http://dx.doi.org/10.3390/e25010079
Similar Items
- A Direct Link between Rényi–Tsallis Entropy and Hölder’s Inequality—Yet Another Proof of Rényi–Tsallis Entropy Maximization
  by: Tanaka, Hisa-Aki, et al.
  Published: (2019)
- Rényi and Tsallis Entropies of the Aharonov–Bohm Ring in Uniform Magnetic Fields
  by: Olendski, Oleg
  Published: (2019)
- Residual and Past Discrete Tsallis and Renyi Extropy with an Application to Softmax Function
  by: Jawa, Taghreed M., et al.
  Published: (2022)
- Software Code Smell Prediction Model Using Shannon, Rényi and Tsallis Entropies
  by: Gupta, Aakanshi, et al.
  Published: (2018)
- Conditional Rényi Divergence Saddlepoint and the Maximization of α-Mutual Information
  by: Cai, Changxiao, et al.
  Published: (2019)