Unsupervised Word Embedding Learning by Incorporating Local and Global Contexts
Word embedding has benefited a broad spectrum of text analysis tasks by learning distributed word representations to encode word semantics. Word representations are typically learned by modeling local contexts of words, assuming that words sharing similar surrounding words are semantically close. We...
Main Authors: Meng, Yu; Huang, Jiaxin; Wang, Guangyuan; Wang, Zihan; Zhang, Chao; Han, Jiawei
Format: Online Article Text
Language: English
Published: Frontiers Media S.A., 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7931948/ https://www.ncbi.nlm.nih.gov/pubmed/33693384 http://dx.doi.org/10.3389/fdata.2020.00009
Similar Items
- Attending Over Triads for Learning Signed Network Embedding
  by: Sodhani, Shagun, et al.
  Published: (2019)
- A World Full of Stereotypes? Further Investigation on Origin and Gender Bias in Multi-Lingual Word Embeddings
  by: Kurpicz-Briki, Mascha, et al.
  Published: (2021)
- Unsupervised domain adaptation methods for cross-species transfer of regulatory code signals
  by: Latyshev, Pavel, et al.
  Published: (2023)
- Proximity-Based Compression for Network Embedding
  by: Islam, Muhammad Ifte, et al.
  Published: (2021)
- Modern Hopfield Networks for graph embedding
  by: Liang, Yuchen, et al.
  Published: (2022)