Domain Word Extension Using Curriculum Learning
Self-supervised learning models, such as BERT, have improved the performance of various tasks in natural language processing. Although their effect is reduced in out-of-domain fields rather than the trained domain, which represents a limitation, it is difficult to train a new language model for a s...
Main Authors: Seong, Sujin; Cha, Jeongwon
Format: Online Article Text
Language: English
Published: MDPI, 2023
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10056774/
https://www.ncbi.nlm.nih.gov/pubmed/36991775
http://dx.doi.org/10.3390/s23063064
Similar Items
- Early number word learning: Associations with domain-general and domain-specific quantitative abilities
  by: Yang, Meiling, et al.
  Published: (2022)
- Machine learning and word sense disambiguation in the biomedical domain: design and evaluation issues
  by: Xu, Hua, et al.
  Published: (2006)
- Music Perception Abilities and Ambiguous Word Learning: Is There Cross-Domain Transfer in Nonmusicians?
  by: Smit, Eline A., et al.
  Published: (2022)
- Pooling region learning of visual word for image classification using bag-of-visual-words model
  by: Xu, Ye, et al.
  Published: (2020)
- Extensions To The Rfq Domain
  by: Swenson, D A
  Published: (1990)