The Conditional Entropy Bottleneck
Much of the field of Machine Learning exhibits a prominent set of failure modes, including vulnerability to adversarial examples, poor out-of-distribution (OoD) detection, miscalibration, and willingness to memorize random labelings of datasets. We characterize these as failures of robust generalization, which extends the traditional measure of generalization as accuracy or related metrics on a held-out set. We hypothesize that these failures to robustly generalize are due to the learning systems retaining too much information about the training data. To test this hypothesis, we propose the Minimum Necessary Information (MNI) criterion for evaluating the quality of a model. In order to train models that perform well with respect to the MNI criterion, we present a new objective function, the Conditional Entropy Bottleneck (CEB), which is closely related to the Information Bottleneck (IB). We experimentally test our hypothesis by comparing the performance of CEB models with deterministic models and Variational Information Bottleneck (VIB) models on a variety of different datasets and robustness challenges. We find strong empirical evidence supporting our hypothesis that MNI models improve on these problems of robust generalization.
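As context for the abstract's key terms, a brief sketch of the quantities involved, written in the standard notation of the information-bottleneck literature (X the input, Y the label, Z the learned representation, γ a trade-off weight; these symbols and the exact parameterization are assumptions drawn from that literature, not quoted from this record):

```latex
% Minimum Necessary Information (MNI) criterion: Z captures exactly
% the label-relevant information in X -- no more, no less.
I(X;Y) = I(X;Z) = I(Y;Z)

% Conditional Entropy Bottleneck (CEB), in one common parameterization:
% penalize information Z keeps about X beyond what it shares with Y.
\min_{Z} \; I(X;Z \mid Y) \;-\; \gamma \, I(Y;Z)

% Under the Markov chain Z <- X <-> Y, the chain rule for mutual
% information gives I(X;Z|Y) = I(X;Z) - I(Y;Z), which connects CEB
% to the Information Bottleneck objective  min I(X;Z) - beta I(Y;Z).
```

At the MNI point the residual term I(X;Z|Y) vanishes, which is why training with CEB is framed as a way of driving models toward the MNI criterion.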
Main Author: | Fischer, Ian |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | MDPI 2020 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7597329/ https://www.ncbi.nlm.nih.gov/pubmed/33286768 http://dx.doi.org/10.3390/e22090999 |
_version_ | 1783602323460718592 |
---|---|
author | Fischer, Ian |
collection | PubMed |
description | Much of the field of Machine Learning exhibits a prominent set of failure modes, including vulnerability to adversarial examples, poor out-of-distribution (OoD) detection, miscalibration, and willingness to memorize random labelings of datasets. We characterize these as failures of robust generalization, which extends the traditional measure of generalization as accuracy or related metrics on a held-out set. We hypothesize that these failures to robustly generalize are due to the learning systems retaining too much information about the training data. To test this hypothesis, we propose the Minimum Necessary Information (MNI) criterion for evaluating the quality of a model. In order to train models that perform well with respect to the MNI criterion, we present a new objective function, the Conditional Entropy Bottleneck (CEB), which is closely related to the Information Bottleneck (IB). We experimentally test our hypothesis by comparing the performance of CEB models with deterministic models and Variational Information Bottleneck (VIB) models on a variety of different datasets and robustness challenges. We find strong empirical evidence supporting our hypothesis that MNI models improve on these problems of robust generalization. |
format | Online Article Text |
id | pubmed-7597329 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2020 |
publisher | MDPI |
record_format | MEDLINE/PubMed |
spelling | pubmed-7597329 2020-11-09 The Conditional Entropy Bottleneck Fischer, Ian Entropy (Basel) Article MDPI 2020-09-08 /pmc/articles/PMC7597329/ /pubmed/33286768 http://dx.doi.org/10.3390/e22090999 Text en © 2020 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/). |
title | The Conditional Entropy Bottleneck |
title_sort | conditional entropy bottleneck |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7597329/ https://www.ncbi.nlm.nih.gov/pubmed/33286768 http://dx.doi.org/10.3390/e22090999 |