
An Upper Bound on the Error Induced by Saddlepoint Approximations—Applications to Information Theory †

This paper introduces an upper bound on the absolute difference between: (a) the cumulative distribution function (CDF) of the sum of a finite number of independent and identically distributed random variables with finite absolute third moment; and (b) a saddlepoint approximation of such a CDF. This upper bound, which is particularly precise in the regime of large deviations, is used to study the dependence testing (DT) bound and the meta-converse (MC) bound on the decoding error probability (DEP) in point-to-point memoryless channels. Often, these bounds cannot be calculated analytically, and thus lower and upper bounds on them become particularly useful. Within this context, the main results include new upper and lower bounds on the DT and MC bounds, respectively. A numerical evaluation of these bounds is presented for the binary symmetric channel, the additive white Gaussian noise channel, and the additive symmetric α-stable noise channel.
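For readers unfamiliar with the technique the abstract refers to, the following is a minimal sketch of a classical saddlepoint CDF approximation (the Lugannani–Rice formula) for the tail of a sum of i.i.d. Bernoulli variables. It is illustrative only: the function name, the binomial example, and the omission of a lattice continuity correction are choices made here, not the construction analyzed in the paper.

```python
import math

def lr_tail_binomial(n, p, x):
    """Lugannani-Rice saddlepoint approximation of P(X >= x) for
    X = sum of n i.i.d. Bernoulli(p) variables.
    Bare-bones sketch: no lattice continuity correction, and it
    assumes x != n*p so that the saddlepoint t_hat is nonzero."""
    q = 1.0 - p
    # CGF of the sum: K(t) = n * log(q + p * e^t).
    # The saddlepoint t_hat solves K'(t_hat) = x; for the Bernoulli
    # sum this has the closed form below.
    t_hat = math.log(x * q / (p * (n - x)))
    e = math.exp(t_hat)
    K = n * math.log(q + p * e)
    K2 = n * p * q * e / (q + p * e) ** 2  # K''(t_hat)
    # Lugannani-Rice ingredients: w from the tilted exponent, u from
    # the curvature at the saddlepoint.
    w = math.copysign(math.sqrt(2.0 * (t_hat * x - K)), t_hat)
    u = t_hat * math.sqrt(K2)
    Phi = lambda z: 0.5 * math.erfc(-z / math.sqrt(2.0))      # standard normal CDF
    phi = lambda z: math.exp(-0.5 * z * z) / math.sqrt(2.0 * math.pi)  # its density
    return 1.0 - Phi(w) + phi(w) * (1.0 / u - 1.0 / w)

# Tail P(X >= 60) for n = 100, p = 0.5, compared against the exact
# binomial tail computed by direct summation.
approx = lr_tail_binomial(100, 0.5, 60)
exact = sum(math.comb(100, k) for k in range(60, 101)) / 2.0 ** 100
```

Even without the continuity correction, the approximation lands close to the exact tail in this large-deviation regime, which is the regime where the paper's error bound is stated to be most precise.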


Bibliographic Details
Main Authors: Anade, Dadja; Gorce, Jean-Marie; Mary, Philippe; Perlaza, Samir M.
Format: Online Article Text
Language: English
Published: MDPI 2020
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7517223/
https://www.ncbi.nlm.nih.gov/pubmed/33286462
http://dx.doi.org/10.3390/e22060690
id pubmed-7517223
collection PubMed
institution National Center for Biotechnology Information
record_format MEDLINE/PubMed
journal Entropy (Basel)
published 2020-06-20
license © 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).