
Fallback Variable History NNLMs: Efficient NNLMs by precomputation and stochastic training

This paper presents a new method to reduce the computational cost of using Neural Networks as Language Models during recognition in some particular scenarios. It is based on a Neural Network that considers input contexts of different lengths in order to ease the use of a fallback mechanism together with the precomputation of softmax normalization constants for these inputs. The proposed approach is empirically validated, showing its capability to emulate lower-order N-grams with a single Neural Network. A machine translation task shows that the proposed model constitutes a good solution to the normalization cost of the output softmax layer of Neural Networks in some practical cases, without a significant impact on performance, while improving system speed.
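The mechanism described in the abstract, falling back to a shorter input context for which the softmax normalization constant has been precomputed, can be sketched roughly as follows. This is an illustrative toy only: the `scores` function is a pseudo-random stand-in for a real NNLM output layer, and all names are placeholders, not the authors' implementation.

```python
import math
import random

def scores(context, vocab_size):
    """Toy stand-in for an NNLM output layer: deterministic pseudo-random
    logits for every vocabulary word given a context tuple."""
    rng = random.Random(hash(context) % (2 ** 32))
    return [rng.uniform(-1.0, 1.0) for _ in range(vocab_size)]

def precompute_constants(contexts, vocab_size):
    """Cache the softmax normalization constant Z(h) = sum_w exp(s(w|h))
    for a fixed set of input contexts of various lengths."""
    return {h: sum(math.exp(s) for s in scores(h, vocab_size))
            for h in contexts}

def fallback_prob(word, context, cache, vocab_size):
    """P(word | context), falling back to progressively shorter histories
    (emulating lower-order N-grams) until a cached constant is found."""
    h = tuple(context)
    while h not in cache and h:
        h = h[1:]  # drop the oldest word: fall back to a lower order
    z = cache.get(h) or sum(math.exp(s) for s in scores(h, vocab_size))
    return math.exp(scores(h, vocab_size)[word]) / z
```

With constants cached only for the empty and length-one contexts, a query with a longer history silently falls back to the longest cached one, so no normalization sum over the vocabulary is needed at recognition time.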


Bibliographic Details
Main authors: Zamora-Martínez, Francisco J., España-Boquera, Salvador, Castro-Bleda, Maria Jose, Palacios-Corella, Adrian
Format: Online Article Text
Language: English
Published: Public Library of Science, 2018
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6062053/
https://www.ncbi.nlm.nih.gov/pubmed/30048480
http://dx.doi.org/10.1371/journal.pone.0200884
Research Article, PLoS One. Published online: 2018-07-26. © 2018 Zamora-Martínez et al. This is an open access article distributed under the terms of the Creative Commons Attribution License (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.