
Sequence-to-sequence pretraining for a less-resourced Slovenian language

INTRODUCTION: Large pretrained language models have recently conquered the area of natural language processing. As an alternative to the predominant masked language modeling introduced in BERT, the T5 model has introduced a more general training objective, namely sequence-to-sequence transformation, whi...
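The abstract refers to the T5 sequence-to-sequence training objective, in which spans of the input text are replaced by sentinel tokens and the model learns to generate the missing spans. A minimal sketch of that objective is shown below, assuming the Hugging Face Transformers library and the public "t5-small" checkpoint purely for illustration; it is not the authors' code and does not assume their Slovenian models.

    # Minimal sketch of T5-style span-corruption pretraining (illustrative only).
    from transformers import T5TokenizerFast, T5ForConditionalGeneration

    tokenizer = T5TokenizerFast.from_pretrained("t5-small")  # illustrative checkpoint
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Corrupted input: spans replaced by sentinel tokens <extra_id_0>, <extra_id_1>.
    source = "The <extra_id_0> walks in <extra_id_1> park"
    # Target: each sentinel followed by the span it replaced.
    target = "<extra_id_0> cute dog <extra_id_1> the <extra_id_2>"

    inputs = tokenizer(source, return_tensors="pt")
    labels = tokenizer(target, return_tensors="pt").input_ids

    # One training step minimizes the cross-entropy loss on the generated spans.
    loss = model(input_ids=inputs.input_ids,
                 attention_mask=inputs.attention_mask,
                 labels=labels).loss
    print(float(loss))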


Bibliographic Details
Main Authors: Ulčar, Matej, Robnik-Šikonja, Marko
Format: Online Article Text
Language: English
Published: Frontiers Media S.A. 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10086348/
https://www.ncbi.nlm.nih.gov/pubmed/37056912
http://dx.doi.org/10.3389/frai.2023.932519