
Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate

Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the predic...
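The abstract above is cut off mid-sentence; the truncation is preserved from the source. As background for the quantity it names (this is the standard information-theoretic relation, not a restatement of the paper's specific method): for a stationary ergodic source P and a language model Q, the per-symbol cross entropy upper-bounds the entropy rate h(P),

H(P, Q) = \lim_{n \to \infty} -\frac{1}{n} \, \mathbb{E}_{P}\!\left[ \log_2 Q(X_1, \ldots, X_n) \right] \;\ge\; h(P) = \lim_{n \to \infty} \frac{1}{n} H(X_1, \ldots, X_n),

with equality if and only if Q matches P, so a better-predicting model yields a tighter upper estimate of the entropy rate.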


Bibliographic Details
Main Authors: Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Online Article Text
Language: English
Published: MDPI 2018
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512401/
https://www.ncbi.nlm.nih.gov/pubmed/33266563
http://dx.doi.org/10.3390/e20110839