A Lower Bound on the Differential Entropy of Log-Concave Random Vectors with Applications
Main Authors:
Format: Online Article, Text
Language: English
Published: MDPI, 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512702/ https://www.ncbi.nlm.nih.gov/pubmed/33265276 http://dx.doi.org/10.3390/e20030185
Summary: We derive a lower bound on the differential entropy of a log-concave random variable X in terms of the p-th absolute moment of X. The new bound leads to a reverse entropy power inequality with an explicit constant, and to new bounds on the rate-distortion function and the channel capacity. Specifically, we study the rate-distortion function for log-concave sources and distortion measure $d(x, \hat{x}) = |x - \hat{x}|^r$, with $r \geq 1$, and we establish that the difference between the rate-distortion function and the Shannon lower bound is at most $\log \sqrt{\pi e} \approx 1.5$ bits, independently of r and the target distortion d. For mean-square error distortion, the difference is at most $\log \sqrt{\pi e / 2} \approx 1$ bit, regardless of d. We also provide bounds on the capacity of memoryless additive noise channels when the noise is log-concave. We show that the difference between the capacity of such channels and the capacity of the Gaussian channel with the same noise power is at most $\log \sqrt{\pi e / 2} \approx 1$ bit. Our results generalize to the case of a random vector X with possibly dependent coordinates. Our proof technique leverages tools from convex geometry.
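For quick reference, the summary's quantitative claims can be written out as displayed inequalities. This is a sketch reconstructed from the abstract alone; the notation $R(d)$ for the rate-distortion function, $R_{\mathrm{SLB}}(d)$ for the Shannon lower bound, and $C$, $C_{\mathrm{G}}$ for the capacities of the log-concave-noise channel and the Gaussian channel with the same noise power is assumed here, not taken from the record.

```latex
% Quantitative claims from the abstract, written as inequalities.
% Assumed notation: R(d) = rate-distortion function of a log-concave source,
% R_SLB(d) = Shannon lower bound, C = capacity of the memoryless additive
% log-concave-noise channel, C_G = capacity of the Gaussian channel with the
% same noise power. Logarithms are base 2, so the bounds are in bits.

% Distortion d(x, \hat{x}) = |x - \hat{x}|^r, r >= 1: the gap to the Shannon
% lower bound is bounded by a constant, uniformly in r and the target
% distortion d.
\[
  R(d) - R_{\mathrm{SLB}}(d) \le \log \sqrt{\pi e} \approx 1.55 \text{ bits}.
\]

% Mean-square error distortion (r = 2): the constant improves.
\[
  R(d) - R_{\mathrm{SLB}}(d) \le \log \sqrt{\pi e / 2} \approx 1.05 \text{ bits}.
\]

% Additive log-concave noise: capacity exceeds the Gaussian-noise capacity
% by at most the same constant.
\[
  C - C_{\mathrm{G}} \le \log \sqrt{\pi e / 2} \approx 1.05 \text{ bits}.
\]
```

Note that $C \geq C_{\mathrm{G}}$ always holds here, since Gaussian noise is the worst case among noises of a given power; the last inequality says that log-concavity of the noise caps how far the capacity can exceed the Gaussian baseline.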