On cross-lingual retrieval with multilingual text encoders
Pretrained multilingual text encoders based on neural transformer architectures, such as multilingual BERT (mBERT) and XLM, have recently become a default paradigm for cross-lingual transfer of natural language processing models, rendering cross-lingual word embedding spaces (CLWEs) effectively obsolete…
Main Authors: Litschko, Robert; Vulić, Ivan; Ponzetto, Simone Paolo; Glavaš, Goran
Format: Online Article Text
Language: English
Published: Springer Netherlands, 2022
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9090691/ https://www.ncbi.nlm.nih.gov/pubmed/35573078 http://dx.doi.org/10.1007/s10791-022-09406-x
Similar Items
- Guest editorial: special issue on ECIR 2021
  by: Hiemstra, Djoerd, et al.
  Published: (2022)
- Deep Multilabel Multilingual Document Learning for Cross-Lingual Document Retrieval
  by: Feng, Kai, et al.
  Published: (2022)
- Cross-lingual word embeddings
  by: Søgaard, Anders, et al.
  Published: (2019)
- African multilingualism viewed from another angle: Challenging the Casamance exception
  by: Sagna, Serge, et al.
  Published: (2021)
- Multilingual assessment of early child development: Analyses from repeated observations of children in Kenya
  by: Knauer, Heather A., et al.
  Published: (2019)