Sequence-to-sequence pretraining for a less-resourced Slovenian language
INTRODUCTION: Large pretrained language models have recently conquered the area of natural language processing. As an alternative to the predominant masked language modeling introduced in BERT, the T5 model introduced a more general training objective, namely sequence-to-sequence transformation, whi...
Main Authors: Ulčar, Matej; Robnik-Šikonja, Marko
Format: Online Article (Text)
Language: English
Published: Frontiers Media S.A., 2023
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10086348/
https://www.ncbi.nlm.nih.gov/pubmed/37056912
http://dx.doi.org/10.3389/frai.2023.932519
Similar Items
- Explaining pretrained language models' understanding of linguistic structures using construction grammar
  by: Weissweiler, Leonie, et al.
  Published: (2023)
- Critical Analysis of Deconfounded Pretraining to Improve Visio-Linguistic Models
  by: Cornille, Nathan, et al.
  Published: (2022)
- Lung Cancer Segmentation With Transfer Learning: Usefulness of a Pretrained Model Constructed From an Artificial Dataset Generated Using a Generative Adversarial Network
  by: Nishio, Mizuho, et al.
  Published: (2021)
- Emotion Analysis of Arabic Tweets: Language Models and Available Resources
  by: Alqahtani, Ghadah, et al.
  Published: (2022)
- Using Twitter Data for the Study of Language Change in Low-Resource Languages. A Panel Study of Relative Pronouns in Frisian
  by: Dijkstra, Jelske, et al.
  Published: (2021)