Entropy Power, Autoregressive Models, and Mutual Information
Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the quantity defined by Shannon in 1948, the entropy rate power, and show that the log ratio of entropy powers equals...
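For background, the entropy power referred to in the abstract is the standard quantity introduced by Shannon in 1948 (this definition is supplied here as context; it is not quoted from the article): for a continuous random variable \(X\) with differential entropy \(h(X)\),

```latex
% Shannon's entropy power: the variance of a Gaussian
% having the same differential entropy as X.
Q(X) = \frac{1}{2\pi e}\, e^{2 h(X)}
```

For a stationary process, the entropy rate power is obtained by replacing \(h(X)\) with the differential entropy rate of the process.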
Main author: | Gibson, Jerry |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2018 |
Online access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512312/ https://www.ncbi.nlm.nih.gov/pubmed/33265839 http://dx.doi.org/10.3390/e20100750 |
Similar Items
- Belavkin–Staszewski Relative Entropy, Conditional Entropy, and Mutual Information
  by: Zhai, Yuan, et al.
  Published: (2022)
- Portfolio Optimization with a Mean-Entropy-Mutual Information Model
  by: Novais, Rodrigo Gonçalves, et al.
  Published: (2022)
- Mutual Information Gain and Linear/Nonlinear Redundancy for Agent Learning, Sequence Analysis, and Modeling
  by: Gibson, Jerry D.
  Published: (2020)
- MIDER: Network Inference with Mutual Information Distance and Entropy Reduction
  by: Villaverde, Alejandro F., et al.
  Published: (2014)
- A Two-Moment Inequality with Applications to Rényi Entropy and Mutual Information
  by: Reeves, Galen
  Published: (2020)