Incorporating Linguistic Knowledge for Learning Distributed Word Representations
Combined with neural language models, distributed word representations achieve significant advantages in computational linguistics and text mining. Most existing models estimate distributed word vectors from large-scale data in an unsupervised fashion, which, however, do not take rich linguistic knowledge…
Main Authors: Wang, Yan; Liu, Zhiyuan; Sun, Maosong
Format: Online Article Text
Language: English
Published: Public Library of Science, 2015
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4395361/ https://www.ncbi.nlm.nih.gov/pubmed/25874581 http://dx.doi.org/10.1371/journal.pone.0118437
Similar Items
- Incorporating domain knowledge in chemical and biomedical named entity recognition with word representations
  by: Munkhdalai, Tsendsuren, et al.
  Published: (2015)
- Linguistic representation
  by: Rosenberg, Jay F.
  Published: (1974)
- Cross-linguistic conditions on word length
  by: Wichmann, Søren, et al.
  Published: (2023)
- Understanding the Phonetic Characteristics of Speech Under Uncertainty—Implications of the Representation of Linguistic Knowledge in Learning and Processing
  by: Tomaschek, Fabian, et al.
  Published: (2022)
- Thai Word Segmentation with a Brain-Inspired Sparse Distributed Representations Learning Memory
  by: Soisoonthorn, Thasayu, et al.
  Published: (2023)