
Deconstructing Cross-Entropy for Probabilistic Binary Classifiers

In this work, we analyze the cross-entropy function, widely used in classifiers both as a performance measure and as an optimization objective. We contextualize cross-entropy in the light of Bayesian decision theory, the formal probabilistic framework for making decisions, and we thoroughly analyze its motivation, meaning and interpretation from an information-theoretical point of view. In this sense, this article presents several contributions: First, we explicitly analyze the contribution to cross-entropy of (i) prior knowledge; and (ii) the value of the features in the form of a likelihood ratio. Second, we introduce a decomposition of cross-entropy into two components: discrimination and calibration. This decomposition enables the measurement of different performance aspects of a classifier in a more precise way; and justifies previously reported strategies to obtain reliable probabilities by means of the calibration of the output of a discriminating classifier. Third, we give different information-theoretical interpretations of cross-entropy, which can be useful in different application scenarios, and which are related to the concept of reference probabilities. Fourth, we present an analysis tool, the Empirical Cross-Entropy (ECE) plot, a compact representation of cross-entropy and its aforementioned decomposition. We show the power of ECE plots, as compared to other classical performance representations, in two diverse experimental examples: a speaker verification system, and a forensic case where some glass findings are present.


Bibliographic Details
Main Authors: Ramos, Daniel; Franco-Pedroso, Javier; Lozano-Diez, Alicia; Gonzalez-Rodriguez, Joaquin
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512723/
https://www.ncbi.nlm.nih.gov/pubmed/33265299
http://dx.doi.org/10.3390/e20030208
collection PubMed
description In this work, we analyze the cross-entropy function, widely used in classifiers both as a performance measure and as an optimization objective. We contextualize cross-entropy in the light of Bayesian decision theory, the formal probabilistic framework for making decisions, and we thoroughly analyze its motivation, meaning and interpretation from an information-theoretical point of view. In this sense, this article presents several contributions: First, we explicitly analyze the contribution to cross-entropy of (i) prior knowledge; and (ii) the value of the features in the form of a likelihood ratio. Second, we introduce a decomposition of cross-entropy into two components: discrimination and calibration. This decomposition enables the measurement of different performance aspects of a classifier in a more precise way; and justifies previously reported strategies to obtain reliable probabilities by means of the calibration of the output of a discriminating classifier. Third, we give different information-theoretical interpretations of cross-entropy, which can be useful in different application scenarios, and which are related to the concept of reference probabilities. Fourth, we present an analysis tool, the Empirical Cross-Entropy (ECE) plot, a compact representation of cross-entropy and its aforementioned decomposition. We show the power of ECE plots, as compared to other classical performance representations, in two diverse experimental examples: a speaker verification system, and a forensic case where some glass findings are present.
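The Empirical Cross-Entropy measure described above can be sketched numerically: given log-likelihood ratios and ground-truth labels for a set of binary trials, posteriors follow from Bayes' rule at an assumed prior, and ECE is the prior-weighted average of the information (negative base-2 log of the posterior of the true hypothesis). This is a minimal illustration consistent with the abstract, not the authors' code; the function and variable names are our own.

```python
import numpy as np

def ece(log_lrs, labels, prior):
    """Empirical Cross-Entropy (in bits) at a given prior P(H1).

    log_lrs -- natural-log likelihood ratios, one per trial
    labels  -- 1 for trials where H1 is true, 0 where H2 is true
    prior   -- assumed prior probability of hypothesis H1
    """
    log_lrs = np.asarray(log_lrs, dtype=float)
    labels = np.asarray(labels)
    prior_log_odds = np.log(prior / (1.0 - prior))
    post_log_odds = log_lrs + prior_log_odds          # Bayes' rule in log-odds form
    # -log2 of each posterior, computed stably with logaddexp:
    info_h1 = np.logaddexp(0.0, -post_log_odds) / np.log(2.0)  # -log2 P(H1 | x)
    info_h2 = np.logaddexp(0.0, post_log_odds) / np.log(2.0)   # -log2 P(H2 | x)
    return (prior * info_h1[labels == 1].mean()
            + (1.0 - prior) * info_h2[labels == 0].mean())

# A non-informative system (LR = 1, i.e. log-LR = 0, on every trial) scores the
# prior entropy -- the "neutral reference" against which ECE plots are read.
neutral = ece([0.0, 0.0], [1, 0], prior=0.5)  # = 1 bit at prior 0.5
```

Sweeping the prior over (0, 1) and plotting `ece` against the neutral reference reproduces the shape of an ECE plot; replacing the raw log-LRs with optimally calibrated ones (e.g. via the pool-adjacent-violators algorithm) would separate the discrimination and calibration components the abstract refers to.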
format Online
Article
Text
id pubmed-7512723
institution National Center for Biotechnology Information
language English
publishDate 2018
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7512723 2020-11-09 Deconstructing Cross-Entropy for Probabilistic Binary Classifiers. Entropy (Basel), Article. MDPI 2018-03-20 /pmc/articles/PMC7512723/ /pubmed/33265299 http://dx.doi.org/10.3390/e20030208 Text en © 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
title Deconstructing Cross-Entropy for Probabilistic Binary Classifiers
topic Article