Schrödinger's tree—On syntax and neural language models
In the last half-decade, the field of natural language processing (NLP) has undergone two major transitions: the switch to neural networks as the primary modeling paradigm and the homogenization of the training regime (pre-train, then fine-tune). Amidst this process, language models have emerged as...
Main Authors: Kulmizev, Artur; Nivre, Joakim
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2022
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9618648/
https://www.ncbi.nlm.nih.gov/pubmed/36325030
http://dx.doi.org/10.3389/frai.2022.796788
Similar Items
- Syntax and prejudice: ethically-charged biases of a syntax-based hate speech recognizer unveiled
  by: Mastromattei, Michele, et al.
  Published: (2022)
- Comparison of Structural Parsers and Neural Language Models as Surprisal Estimators
  by: Oh, Byung-Doh, et al.
  Published: (2022)
- The neural machine translation models for the low-resource Kazakh–English language pair
  by: Karyukin, Vladislav, et al.
  Published: (2023)
- Extracting relations from texts using vector language models and a neural network classifier
  by: Shishaev, Maksim, et al.
  Published: (2023)
- Judgment aggregation, discursive dilemma and reflective equilibrium: Neural language models as self-improving doxastic agents
  by: Betz, Gregor, et al.
  Published: (2022)