Entropy Rate Estimation for English via a Large Cognitive Experiment Using Mechanical Turk
The entropy rate h of a natural language quantifies the complexity underlying the language. While recent studies have used computational approaches to estimate this rate, their results rely fundamentally on the performance of the language model used for prediction. On the other hand, in 1951, Shanno...
Main Authors: Ren, Geng; Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Online Article Text
Language: English
Published: MDPI, 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7514546/ http://dx.doi.org/10.3390/e21121201
Similar Items
- Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
  by: Takahashi, Shuntaro, et al.
  Published: (2018)
- Retrovirology and young Turks...
  by: Jeang, Kuan-Teh
  Published: (2004)
- Turking in the time of COVID
  by: Arechar, Antonio A., et al.
  Published: (2021)
- Do neural nets learn statistical laws behind natural language?
  by: Takahashi, Shuntaro, et al.
  Published: (2017)
- Yüzyilin deneyinde Türk bilim adamlari
  Published: (2008)