
Entropy Power, Autoregressive Models, and Mutual Information

Autoregressive processes play a major role in speech processing (linear prediction), seismic signal processing, biological signal processing, and many other applications. We consider the quantity defined by Shannon in 1948, the entropy rate power, and show that the log ratio of entropy powers equals the difference in the differential entropy of the two processes. Furthermore, we use the log ratio of entropy powers to analyze the change in mutual information as the model order is increased for autoregressive processes. We examine when we can substitute the minimum mean squared prediction error for the entropy power in the log ratio of entropy powers, thus greatly simplifying the calculations to obtain the differential entropy and the change in mutual information and therefore increasing the utility of the approach. Applications to speech processing and coding are given and potential applications to seismic signal processing, EEG classification, and ECG classification are described.
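As a brief sketch of the first identity stated in the abstract (standard facts about Shannon's 1948 entropy power, not results specific to this paper): the entropy power of a process $X$ with differential entropy rate $h(X)$ is

% Entropy power: the variance of a Gaussian source having the same
% differential entropy rate h(X) as the process X.
\[
  Q_X = \frac{1}{2\pi e}\, e^{2h(X)},
\]

so the constant cancels in a ratio, and the log ratio of two entropy powers reduces to a difference of differential entropies:

% Log ratio of entropy powers = difference of differential entropies.
\[
  \frac{1}{2}\ln\frac{Q_{X_1}}{Q_{X_2}} = h(X_1) - h(X_2).
\]

For a Gaussian autoregressive source, the order-$m$ prediction error is itself Gaussian, so its entropy power equals its variance, i.e., the minimum mean squared prediction error; this is the case in which substituting the prediction error for the entropy power, as the abstract describes, is exact, and the change in mutual information as the model order grows can then be tracked through the log ratio of successive prediction-error variances.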

Bibliographic Details
Main Author: Gibson, Jerry
Format: Online Article Text
Language: English
Published: MDPI, 2018
Subjects: Article
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512312/
https://www.ncbi.nlm.nih.gov/pubmed/33265839
http://dx.doi.org/10.3390/e20100750

Journal: Entropy (Basel)
Date Published: 2018-09-30
License: © 2018 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).