Languages with more speakers tend to be harder to (machine-)learn
Computational language models (LMs), most notably exemplified by the widespread success of OpenAI's ChatGPT chatbot, show impressive performance on a wide range of linguistic tasks, thus providing cognitive science and linguistics with a computational working model to empirically study differen...
Main authors: Koplenig, Alexander; Wolfer, Sascha
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK, 2023
Subjects:
Online access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10613286/ https://www.ncbi.nlm.nih.gov/pubmed/37898699 http://dx.doi.org/10.1038/s41598-023-45373-z
Similar items
- Language structure is influenced by the number of speakers but seemingly not by the proportion of non-native speakers
  by: Koplenig, Alexander
  Published: (2019)
- A large quantitative analysis of written language challenges the idea that all languages are equally complex
  by: Koplenig, Alexander, et al.
  Published: (2023)
- Adaptive Communication: Languages with More Non-Native Speakers Tend to Have Fewer Word Forms
  by: Bentz, Christian, et al.
  Published: (2015)
- Studying Lexical Dynamics and Language Change via Generalized Entropies: The Problem of Sample Size
  by: Koplenig, Alexander, et al.
  Published: (2019)
- Machine Learning for Brain Images Classification of Two Language Speakers
  by: Barranco-Gutiérrez, Alejandro-Israel
  Published: (2020)