
On cross-lingual retrieval with multilingual text encoders

Pretrained multilingual text encoders based on neural transformer architectures, such as multilingual BERT (mBERT) and XLM, have recently become a default paradigm for cross-lingual transfer of natural language processing models, rendering cross-lingual word embedding spaces (CLWEs) effectively obsolete […]
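The abstract concerns using multilingual encoders such as mBERT as rankers for cross-lingual retrieval. As a minimal illustrative sketch (not the authors' exact setup, which also covers specialized and fine-tuned encoders), the snippet below mean-pools mBERT token embeddings and ranks foreign-language documents against an English query by cosine similarity; the checkpoint name and example texts are assumptions chosen only for illustration.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Off-the-shelf mBERT used as a sentence encoder via mean pooling
# (a simplified sketch of cross-lingual retrieval, not the paper's setup).
tok = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")
enc = AutoModel.from_pretrained("bert-base-multilingual-cased")

def embed(texts):
    batch = tok(texts, padding=True, truncation=True, return_tensors="pt")
    with torch.no_grad():
        hidden = enc(**batch).last_hidden_state      # (batch, tokens, dim)
    mask = batch["attention_mask"].unsqueeze(-1)     # (batch, tokens, 1)
    return (hidden * mask).sum(1) / mask.sum(1)      # mean over real tokens

query = ["How do vaccines work?"]                    # English query
docs = ["Impfstoffe trainieren das Immunsystem.",    # German document
        "La capital de Francia es París."]           # Spanish document

# Rank documents by cosine similarity in the shared multilingual space.
scores = torch.nn.functional.cosine_similarity(embed(query), embed(docs))
print(scores)  # higher score = better cross-lingual match
```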


Bibliographic Details
Main Authors: Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo; Glavaš, Goran
Format: Online Article Text
Language: English
Published: Springer Netherlands, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9090691/
https://www.ncbi.nlm.nih.gov/pubmed/35573078
http://dx.doi.org/10.1007/s10791-022-09406-x

Similar Items