A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications
We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure [Formula: see text], with [Formula: see text], and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most [Formula: see text] bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most [Formula: see text] bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most [Formula: see text] bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
Main Authors: | Marsiglietti, Arnaud; Kostina, Victoria |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI, 2018 |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512702/ https://www.ncbi.nlm.nih.gov/pubmed/33265276 http://dx.doi.org/10.3390/e20030185 |
_version_ | 1783586218834919424 |
author | Marsiglietti, Arnaud Kostina, Victoria |
author_facet | Marsiglietti, Arnaud Kostina, Victoria |
author_sort | Marsiglietti, Arnaud |
collection | PubMed |
description | We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure [Formula: see text], with [Formula: see text], and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most [Formula: see text] bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most [Formula: see text] bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most [Formula: see text] bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry. |
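The "[Formula: see text]" markers are PMC extraction damage: the actual constants are not recoverable from this record. Assuming the gap constants are log √(πe) for general r-th power distortion and log √(πe/2) for mean-square error and the capacity bound (an assumption based on the published version of the paper, not on this record), a quick numerical check of their values in bits:

```python
import math

# Assumed (not recoverable from this record): the rate-distortion gap for
# distortion |x - x^|^r, r >= 1, is log sqrt(pi*e), and the MSE / capacity
# gap is log sqrt(pi*e/2). Values below are in bits (base-2 logarithm).

# Gap between the rate-distortion function and the Shannon lower bound
gap_general = 0.5 * math.log2(math.pi * math.e)      # about 1.55 bits

# Gap for mean-square error distortion; also the gap between the capacity of
# a log-concave additive noise channel and the matching Gaussian channel
gap_mse = 0.5 * math.log2(math.pi * math.e / 2)      # about 1.05 bits

print(f"general distortion gap: {gap_general:.3f} bits")
print(f"MSE / capacity gap:     {gap_mse:.3f} bits")
```

Both gaps are constants, consistent with the abstract's claim that they hold independently of r and of the target distortion d.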
format | Online Article Text |
id | pubmed-7512702 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2018 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7512702 2020-11-09 A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications Marsiglietti, Arnaud Kostina, Victoria Entropy (Basel) Article MDPI 2018-03-09 /pmc/articles/PMC7512702/ /pubmed/33265276 http://dx.doi.org/10.3390/e20030185 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
spellingShingle | Article Marsiglietti, Arnaud Kostina, Victoria A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title | A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title_full | A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title_fullStr | A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title_full_unstemmed | A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title_short | A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications |
title_sort | lower bound on the differential entropy of log-concave random vectors with applications |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512702/ https://www.ncbi.nlm.nih.gov/pubmed/33265276 http://dx.doi.org/10.3390/e20030185 |
work_keys_str_mv | AT marsigliettiarnaud alowerboundonthedifferentialentropyoflogconcaverandomvectorswithapplications AT kostinavictoria alowerboundonthedifferentialentropyoflogconcaverandomvectorswithapplications AT marsigliettiarnaud lowerboundonthedifferentialentropyoflogconcaverandomvectorswithapplications AT kostinavictoria lowerboundonthedifferentialentropyoflogconcaverandomvectorswithapplications |