Do neural nets learn statistical laws behind natural language?
The performance of deep learning in natural language processing has been spectacular, but the reasons for this success remain unclear because of the inherent complexity of deep learning. This paper provides empirical evidence of its effectiveness and of a limitation of neural networks for language e...
Main Authors: Takahashi, Shuntaro; Tanaka-Ishii, Kumiko
Format: Online Article Text
Language: English
Published: Public Library of Science, 2017
Subjects:
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC5747447/
https://www.ncbi.nlm.nih.gov/pubmed/29287076
http://dx.doi.org/10.1371/journal.pone.0189326
Similar Items
- Cross Entropy of Neural Language Models at Infinity—A New Bound of the Entropy Rate
  by: Takahashi, Shuntaro, et al.
  Published: (2018)
- Menzerath’s Law in the Syntax of Languages Compared with Random Sentences
  by: Tanaka-Ishii, Kumiko
  Published: (2021)
- Statistical universals of language: mathematical chance vs. human choice
  by: Tanaka-Ishii, Kumiko
  Published: (2021)
- Entropy Rate Estimation for English via a Large Cognitive Experiment Using Mechanical Turk
  by: Ren, Geng, et al.
  Published: (2019)
- Semi-Supervised Learning of Statistical Models for Natural Language Understanding
  by: Zhou, Deyu, et al.
  Published: (2014)