Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
Neural language models have drawn a lot of attention for their strong ability to predict natural language text. In this paper, we estimate the entropy rate of natural language with state-of-the-art neural language models. To obtain the estimate, we consider the cross entropy, a measure of the predic...
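The truncated abstract refers to cross entropy as the quantity used to bound the entropy rate. A minimal sketch of that standard relationship, using textbook definitions rather than anything quoted from this record (the symbols p, q, h, and D are illustrative assumptions):

```latex
% Per-symbol cross entropy of a language model q evaluated on
% text drawn from the true source p (standard definition):
H(p, q) = -\lim_{n \to \infty} \frac{1}{n}\,
          \mathbb{E}_{p}\!\left[ \log_2 q(X_1, \dots, X_n) \right]

% Cross entropy decomposes into the entropy rate h(p) plus the
% KL divergence rate between p and q, which is nonnegative:
H(p, q) = h(p) + D(p \,\|\, q) \;\ge\; h(p)

% Hence the cross entropy achieved by any language model is an
% upper bound on the entropy rate of natural language, and a
% stronger model yields a tighter bound.
```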
Main Authors: Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Online Article (Text)
Language: English
Published: MDPI, 2018
Online Access:
- https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7512401/
- https://www.ncbi.nlm.nih.gov/pubmed/33266563
- http://dx.doi.org/10.3390/e20110839
Similar Items
- Entropy Rate Estimation for English via a Large Cognitive Experiment Using Mechanical Turk
  by: Ren, Geng, et al.
  Published: (2019)
- Do neural nets learn statistical laws behind natural language?
  by: Takahashi, Shuntaro, et al.
  Published: (2017)
- Entropy bounds and isoperimetry
  by: Bobkov, SG, et al.
  Published: (2005)
- A Causal Entropy Bound
  by: Brustein, R., et al.
  Published: (1999)
- Entropy Bounds and String Cosmology
  by: Veneziano, G.
  Published: (1999)