Mutual Information Gain and Linear/Nonlinear Redundancy for Agent Learning, Sequence Analysis, and Modeling
In many applications, intelligent agents need to identify any structure or apparent randomness in an environment and respond appropriately. We use the relative entropy to separate and quantify the presence of both linear and nonlinear redundancy in a sequence and we introduce the new quantities of t...
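The abstract describes using relative entropy to separate linear from nonlinear redundancy in a sequence. As a rough illustration only (not the paper's own definitions or estimators), the sketch below compares a Gaussian, correlation-based mutual information between consecutive samples with a plug-in histogram estimate of the same mutual information, and treats the difference as a crude "nonlinear" remainder; the function names, bin count, and test signals are all assumptions made for this example.

```python
# Hypothetical sketch, not the paper's method: contrast a Gaussian (linear)
# lag-1 mutual information with a histogram-based estimate of the full lag-1
# mutual information, and report the difference as a rough nonlinear remainder.
import numpy as np

def gaussian_mi_lag1(x):
    """MI (nats) between x[t] and x[t+1] if they were jointly Gaussian."""
    rho = np.corrcoef(x[:-1], x[1:])[0, 1]
    return -0.5 * np.log(1.0 - rho ** 2)

def histogram_mi_lag1(x, bins=16):
    """Plug-in histogram estimate of MI (nats) between x[t] and x[t+1]."""
    joint, _, _ = np.histogram2d(x[:-1], x[1:], bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x[t]
    py = pxy.sum(axis=0, keepdims=True)   # marginal of x[t+1]
    mask = pxy > 0
    return np.sum(pxy[mask] * np.log(pxy[mask] / (px @ py)[mask]))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n = 100_000
    # AR(1) sequence: dependence between samples is purely linear.
    x = np.zeros(n)
    for t in range(1, n):
        x[t] = 0.8 * x[t - 1] + rng.standard_normal()
    # A memoryless nonlinear transform of the same sequence.
    y = np.tanh(x)

    for name, s in [("AR(1)", x), ("tanh(AR(1))", y)]:
        total = histogram_mi_lag1(s)
        linear = gaussian_mi_lag1(s)
        print(f"{name}: total≈{total:.3f} nats, linear≈{linear:.3f} nats, "
              f"nonlinear remainder≈{max(total - linear, 0.0):.3f} nats")
```

This only looks at adjacent samples and uses a biased plug-in estimator, so it is a toy decomposition under stated assumptions rather than the redundancy measures defined in the article.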
Main author: Gibson, Jerry D.
Format: Online Article Text
Language: English
Published: MDPI (2020)
Online access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7517148/
https://www.ncbi.nlm.nih.gov/pubmed/33286380
http://dx.doi.org/10.3390/e22060608
Similar items
- Entropy Power, Autoregressive Models, and Mutual Information
  by: Gibson, Jerry
  Published: (2018)
- Mutual Information and Multi-Agent Systems
  by: Moskowitz, Ira S., et al.
  Published: (2022)
- Mutual information: Measuring nonlinear dependence in longitudinal epidemiological data
  by: Young, Alexander L., et al.
  Published: (2023)
- Testing Nonlinearity with Rényi and Tsallis Mutual Information with an Application in the EKC Hypothesis
  by: Tuna, Elif, et al.
  Published: (2022)
- Automatic Seizure Detection Based on Nonlinear Dynamical Analysis of EEG Signals and Mutual Information
  by: Akbarian, Behnaz, et al.
  Published: (2018)