Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding
Although Shannon mutual information has been widely used, its effective calculation is often difficult for many practical problems, including those in neural population coding. Asymptotic formulas based on Fisher information sometimes provide accurate approximations to the mutual information but thi...
Main authors: Huang, Wentao; Zhang, Kechen
Format: Online Article Text
Language: English
Published: MDPI, 2019
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514724/
https://www.ncbi.nlm.nih.gov/pubmed/33266958
http://dx.doi.org/10.3390/e21030243
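The abstract concerns approximating Shannon mutual information when direct calculation is impractical, as in neural population coding. As a point of reference, the sketch below is an illustration only, not the paper's Fisher-information-based method: it computes the exact mutual information of two small discrete variables from their joint probability table. The function name `mutual_information` and the joint table `p_xy` are hypothetical choices for this example.

```python
import numpy as np

def mutual_information(p_xy: np.ndarray) -> float:
    """Exact I(X;Y) in bits from a joint probability matrix p_xy[i, j] = P(X=i, Y=j)."""
    p_xy = np.asarray(p_xy, dtype=float)
    p_x = p_xy.sum(axis=1, keepdims=True)   # marginal P(X), column vector
    p_y = p_xy.sum(axis=0, keepdims=True)   # marginal P(Y), row vector
    mask = p_xy > 0                          # treat 0 * log 0 as 0
    return float(np.sum(p_xy[mask] * np.log2(p_xy[mask] / (p_x @ p_y)[mask])))

# Hypothetical example: a noisy binary-channel-like joint distribution.
p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
print(f"I(X;Y) = {mutual_information(p_xy):.4f} bits")
```

For a large neural population the joint table over stimuli and responses grows exponentially, so this direct sum becomes infeasible; that is the regime in which the asymptotic approximations based on Fisher information discussed in the abstract are intended to help.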
Similar items
- An information geometrical evaluation of Shannon information metrics on a discrete n-dimensional digital manifold
  by: Koltuksuz, Ahmet, et al.
  Published: (2023)
- Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples
  by: Hernández, Damián G., et al.
  Published: (2019)
- Mutual Information between Discrete Variables with Many Categories using Recursive Adaptive Partitioning
  by: Seok, Junhee, et al.
  Published: (2015)
- Mutual Information between Discrete and Continuous Data Sets
  by: Ross, Brian C.
  Published: (2014)
- Global Seismic Nowcasting With Shannon Information Entropy
  by: Rundle, John B., et al.
  Published: (2019)