Information Bottleneck Analysis by a Conditional Mutual Information Bound
Main Authors:
Format: Online Article Text
Language: English
Published: MDPI, 2021
Subjects:
Online Access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8391358/
- https://www.ncbi.nlm.nih.gov/pubmed/34441114
- http://dx.doi.org/10.3390/e23080974
Summary: Task-nuisance decomposition describes why the information bottleneck loss I(x;z) − βI(y;z) is a suitable objective for supervised learning. The true category y is predicted for input x using latent variables z. When n is a nuisance independent of y, I(z;n) can be decreased by reducing I(x;z), since the latter upper bounds the former. We extend this framework by demonstrating that the conditional mutual information I(x;z|y) provides an alternative upper bound for I(z;n). This bound is applicable even if z is not a sufficient representation of x, that is, I(y;z) ≠ I(y;x). We used mutual information neural estimation (MINE) to estimate I(x;z|y). Experiments demonstrated that I(x;z|y) is smaller than I(x;z) for layers closer to the input, matching the claim that the former is a tighter bound than the latter. Because of this difference, the information plane differs when I(x;z|y) is used instead of I(x;z).
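The two bounds in the summary can be checked exactly on a toy discrete model. The sketch below is illustrative and not from the paper: it assumes a task variable y and a nuisance n that are independent and uniform, an input x = (y, n), and a representation z that keeps the task bit plus one nuisance bit. All mutual informations are computed by exhaustive enumeration, and the result shows I(z;n) ≤ I(x;z|y) < I(x;z), i.e., the conditional bound is the tighter one.

```python
import math
from collections import Counter
from itertools import product

def entropy(counts):
    """Shannon entropy (bits) of an empirical distribution given as a Counter."""
    total = sum(counts.values())
    return -sum(c / total * math.log2(c / total) for c in counts.values() if c)

def mutual_info(pairs):
    """I(A;B) = H(A) + H(B) - H(A,B), from an exhaustive list of (a, b) samples."""
    return (entropy(Counter(a for a, b in pairs))
            + entropy(Counter(b for a, b in pairs))
            - entropy(Counter(pairs)))

def cond_mutual_info(triples):
    """I(A;B|C) = I(A,C;B) - I(C;B), by the chain rule, from (a, b, c) samples."""
    return (mutual_info([((a, c), b) for a, b, c in triples])
            - mutual_info([(c, b) for a, b, c in triples]))

# Toy generative model (illustrative): y in {0,1} and n in {0,1,2,3} are
# independent and uniform; the input is x = (y, n); the representation
# keeps the task bit and one of the two nuisance bits: z = (y, n % 2).
samples = [((y, n), (y, n % 2), y, n)
           for y, n in product(range(2), range(4))]

I_zn  = mutual_info([(z, n) for x, z, y, n in samples])          # I(z;n)
I_xz  = mutual_info([(x, z) for x, z, y, n in samples])          # I(x;z)
I_xzy = cond_mutual_info([(x, z, y) for x, z, y, n in samples])  # I(x;z|y)

print(f"I(z;n)   = {I_zn:.3f} bits")   # 1.000
print(f"I(x;z|y) = {I_xzy:.3f} bits")  # 1.000
print(f"I(x;z)   = {I_xz:.3f} bits")   # 2.000
```

Here z leaks exactly one nuisance bit, so I(z;n) = 1 bit; the conditional bound I(x;z|y) matches it exactly at 1 bit, while the unconditional bound I(x;z) overshoots at 2 bits because it also counts the task information carried by z. Note that I(x;z|y) ≤ I(x;z) relies on the Markov structure y → x → z assumed here; conditioning can increase mutual information in general.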