Estimating the Mutual Information between Two Discrete, Asymmetric Variables with Limited Samples
Determining the strength of nonlinear, statistical dependencies between two variables is a crucial matter in many research fields. The established measure for quantifying such relations is the mutual information. However, estimating mutual information from limited samples is a challenging task. Sinc...
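The abstract refers to estimating mutual information from a limited number of samples. The simplest baseline for this task is the plug-in (maximum-likelihood) estimator, which substitutes empirical frequencies for the true probabilities. The sketch below is not the estimator proposed in this article; it is only a minimal illustration, assuming NumPy, of the small-sample bias that makes the problem challenging.

```python
import numpy as np

def plugin_mutual_information(x, y):
    """Naive plug-in estimate of I(X;Y) in bits from paired discrete samples.

    Empirical joint frequencies stand in for the true probabilities, which
    is known to bias the estimate upward when samples are scarce.
    """
    x = np.asarray(x)
    y = np.asarray(y)
    _, x_idx = np.unique(x, return_inverse=True)
    _, y_idx = np.unique(y, return_inverse=True)
    joint = np.zeros((x_idx.max() + 1, y_idx.max() + 1))
    np.add.at(joint, (x_idx, y_idx), 1)        # count co-occurrences
    joint /= joint.sum()                        # empirical joint distribution
    px = joint.sum(axis=1, keepdims=True)       # empirical marginal of X
    py = joint.sum(axis=0, keepdims=True)       # empirical marginal of Y
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (px @ py)[nz])))

# Even for independent variables (true I(X;Y) = 0), the plug-in estimate
# stays clearly positive when the sample is small relative to the number
# of categories.
rng = np.random.default_rng(0)
x = rng.integers(0, 10, size=50)   # 10 categories, only 50 samples
y = rng.integers(0, 10, size=50)   # drawn independently of x
print(plugin_mutual_information(x, y))
```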
| Main Authors: | Hernández, Damián G.; Samengo, Inés |
| --- | --- |
| Format: | Online Article Text |
| Language: | English |
| Published: | MDPI, 2019 |
| Subjects: | |
| Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7515115/ https://www.ncbi.nlm.nih.gov/pubmed/33267337 http://dx.doi.org/10.3390/e21060623 |
Similar Items
- Mutual Information between Discrete Variables with Many Categories using Recursive Adaptive Partitioning
  by: Seok, Junhee, et al.
  Published: (2015)
- Mutual Information between Discrete and Continuous Data Sets
  by: Ross, Brian C.
  Published: (2014)
- Inferring a Property of a Large System from a Small Number of Samples
  by: Hernández, Damián G., et al.
  Published: (2022)
- Approximations of Shannon Mutual Information for Discrete Variables with Applications to Neural Population Coding
  by: Huang, Wentao, et al.
  Published: (2019)
- Limitations to Estimating Mutual Information in Large Neural Populations
  by: Mölter, Jan, et al.
  Published: (2020)