G2Basy: A framework to improve the RNN language model and ease overfitting problem
Recurrent neural networks are an efficient way to train language models, and various RNN architectures have been proposed to improve performance. However, as network scale increases, the overfitting problem becomes more pressing. In this paper, we propose a framework, G2Basy, to speed up the traini...
Main Authors: Yuwen, Lu; Chen, Shuyu; Yuan, Xiaohan
Format: Online Article Text
Language: English
Published: Public Library of Science, 2021
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8046238/ https://www.ncbi.nlm.nih.gov/pubmed/33852595 http://dx.doi.org/10.1371/journal.pone.0249820
Similar Items
- The overfitted brain hypothesis
  by: Prince, Luke Y., et al.
  Published: (2021)
- Hands-on training about overfitting
  by: Demšar, Janez, et al.
  Published: (2021)
- Overfitting Bayesian Mixture Models with an Unknown Number of Components
  by: van Havre, Zoé, et al.
  Published: (2015)
- RNN Language Processing Model-Driven Spoken Dialogue System Modeling Method
  by: Zhu, Xia
  Published: (2022)
- RLSTM: A New Framework of Stock Prediction by Using Random Noise for Overfitting Prevention
  by: Zheng, Hongying, et al.
  Published: (2021)