
Mixture of Experts with Entropic Regularization for Data Classification

Today, there is growing interest in automatic classification across a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. “Mixture-of-experts” is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gating network, typically based on softmax functions, that learns complex patterns in data. In this scheme, each data point tends to be influenced by only one expert, so training can be misguided on real datasets in which complex data points need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network’s output, in order to avoid a “winner-takes-all” gating. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3–6% on some datasets. In future work, we plan to embed feature selection into this model.
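The regularization the abstract describes can be captured in a few lines. Below is a minimal sketch (not the authors' code, and not tied to their exact formulation): the usual mixture-of-experts negative log-likelihood is penalized by the mean Shannon entropy of the softmax gate, so that minimizing the loss rewards gates that spread responsibility across experts. The logistic form of the experts, the weight lam, and all names are illustrative assumptions.

    # Minimal sketch of entropic regularization for a mixture of experts.
    # Not the authors' implementation; experts are logistic classifiers
    # and "lam" is an assumed regularization weight.
    import numpy as np

    def softmax(z):
        z = z - z.max(axis=1, keepdims=True)  # subtract row max for numerical stability
        e = np.exp(z)
        return e / e.sum(axis=1, keepdims=True)

    def entropic_moe_loss(x, y, gate_w, expert_w, lam=0.1):
        g = softmax(x @ gate_w)                      # (n, K) gating probabilities
        p = 1.0 / (1.0 + np.exp(-(x @ expert_w)))    # (n, K) per-expert P(y=1 | x)
        p_y = np.where(y[:, None] == 1, p, 1.0 - p)  # likelihood each expert assigns to y
        nll = -np.log((g * p_y).sum(axis=1) + 1e-12).mean()         # mixture NLL
        gate_entropy = -(g * np.log(g + 1e-12)).sum(axis=1).mean()  # Shannon entropy of gate
        return nll - lam * gate_entropy              # entropy bonus discourages winner-takes-all

    # Toy usage: 100 two-dimensional points, 2 experts, random (untrained) weights.
    rng = np.random.default_rng(0)
    x = rng.normal(size=(100, 2))
    y = (x[:, 0] > 0).astype(int)
    print(entropic_moe_loss(x, y, rng.normal(size=(2, 2)), rng.normal(size=(2, 2))))

Note the sign: subtracting lam times the gate entropy from the cost means that gradient descent favors higher-entropy gates, matching the abstract's goal of letting multiple experts explain a point; setting lam = 0 recovers the regular mixture-of-experts objective.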


Bibliographic Details
Main Authors: Peralta, Billy; Saavedra, Ariel; Caro, Luis; Soto, Alvaro
Format: Online Article Text
Language: English
Published: MDPI 2019
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514672/
https://www.ncbi.nlm.nih.gov/pubmed/33266905
http://dx.doi.org/10.3390/e21020190
_version_ 1783586642076893184
author Peralta, Billy
Saavedra, Ariel
Caro, Luis
Soto, Alvaro
author_facet Peralta, Billy
Saavedra, Ariel
Caro, Luis
Soto, Alvaro
author_sort Peralta, Billy
collection PubMed
description Today, there is growing interest in automatic classification across a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. “Mixture-of-experts” is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gating network, typically based on softmax functions, that learns complex patterns in data. In this scheme, each data point tends to be influenced by only one expert, so training can be misguided on real datasets in which complex data points need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network’s output, in order to avoid a “winner-takes-all” gating. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3–6% on some datasets. In future work, we plan to embed feature selection into this model.
format Online
Article
Text
id pubmed-7514672
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-7514672 2020-11-09 Mixture of Experts with Entropic Regularization for Data Classification Peralta, Billy Saavedra, Ariel Caro, Luis Soto, Alvaro Entropy (Basel) Article Today, there is growing interest in automatic classification across a variety of tasks, such as weather forecasting, product recommendation, intrusion detection, and people recognition. “Mixture-of-experts” is a well-known classification technique: a probabilistic model consisting of local expert classifiers weighted by a gating network, typically based on softmax functions, that learns complex patterns in data. In this scheme, each data point tends to be influenced by only one expert, so training can be misguided on real datasets in which complex data points need to be explained by multiple experts. In this work, we propose a variant of the regular mixture-of-experts model in which the classification cost is penalized by the Shannon entropy of the gating network’s output, in order to avoid a “winner-takes-all” gating. Experiments on several real datasets show the advantage of our approach, with improvements in mean accuracy of 3–6% on some datasets. In future work, we plan to embed feature selection into this model. MDPI 2019-02-18 /pmc/articles/PMC7514672/ /pubmed/33266905 http://dx.doi.org/10.3390/e21020190 Text en © 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
spellingShingle Article
Peralta, Billy
Saavedra, Ariel
Caro, Luis
Soto, Alvaro
Mixture of Experts with Entropic Regularization for Data Classification
title Mixture of Experts with Entropic Regularization for Data Classification
title_full Mixture of Experts with Entropic Regularization for Data Classification
title_fullStr Mixture of Experts with Entropic Regularization for Data Classification
title_full_unstemmed Mixture of Experts with Entropic Regularization for Data Classification
title_short Mixture of Experts with Entropic Regularization for Data Classification
title_sort mixture of experts with entropic regularization for data classification
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514672/
https://www.ncbi.nlm.nih.gov/pubmed/33266905
http://dx.doi.org/10.3390/e21020190
work_keys_str_mv AT peraltabilly mixtureofexpertswithentropicregularizationfordataclassification
AT saavedraariel mixtureofexpertswithentropicregularizationfordataclassification
AT caroluis mixtureofexpertswithentropicregularizationfordataclassification
AT sotoalvaro mixtureofexpertswithentropicregularizationfordataclassification