Syntactically-informed word representations from graph neural network
Most deep language understanding models depend only on word representations, which are mainly derived from language modelling over large amounts of raw text. These models encode distributional knowledge without considering syntactic structural information, although several studies have shown be...
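The record describes word representations that are informed by syntax via a graph neural network. As a rough illustration of the general idea only (not the authors' implementation), the sketch below applies a single graph-convolution step over a toy dependency graph, so each word's embedding is mixed with those of its syntactic neighbours; the function name, dimensions, and adjacency matrix are all assumptions made for the example.

```python
# Illustrative sketch: one graph-convolution step over a dependency graph,
# producing syntax-aware word vectors from plain word embeddings.
import numpy as np

def gcn_layer(X, A, W):
    """H = ReLU(D^-1 (A + I) X W).

    X: (n_words, d_in) word embeddings, e.g. from a language model.
    A: (n_words, n_words) adjacency matrix of the dependency parse.
    W: (d_in, d_out) projection matrix (random here, learned in practice).
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # row-normalise by node degree
    H = D_inv @ A_hat @ X @ W                 # aggregate neighbour features
    return np.maximum(H, 0.0)                 # ReLU

# Toy sentence "models encode knowledge" with head-dependent edges.
rng = np.random.default_rng(0)
X = rng.normal(size=(3, 8))                   # 3 words, 8-dim embeddings
A = np.array([[0, 1, 0],                      # models  <-> encode
              [1, 0, 1],                      # encode  <-> knowledge
              [0, 1, 0]], dtype=float)
W = rng.normal(size=(8, 8))
syntax_aware = gcn_layer(X, A, W)             # (3, 8) syntax-informed vectors
print(syntax_aware.shape)
```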
Main Authors: Tran, Thy Thy; Miwa, Makoto; Ananiadou, Sophia
Format: Online Article Text
Language: English
Published: Elsevier Science Publishers, 2020
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7593959/
https://www.ncbi.nlm.nih.gov/pubmed/33162674
http://dx.doi.org/10.1016/j.neucom.2020.06.070
Similar Items
- DeepEventMine: end-to-end neural nested event extraction from biomedical texts
  by: Trieu, Hai-Long, et al.
  Published: (2020)
- Adverse drug events and medication relation extraction in electronic health records with ensemble deep learning methods
  by: Christopoulou, Fenia, et al.
  Published: (2019)
- Attention enhanced capsule network for text classification by encoding syntactic dependency trees with graph convolutional neural network
  by: Jia, Xudong, et al.
  Published: (2022)
- Word Order Variation is Partially Constrained by Syntactic Complexity
  by: Jing, Yingqi, et al.
  Published: (2021)
- Graph Neural Network for representation learning of lung cancer
  by: Aftab, Rukhma, et al.
  Published: (2023)