Recognizing Semantic Relations: Attention-Based Transformers vs. Recurrent Models
Automatically recognizing an existing semantic relation (such as “is a”, “part of”, “property of”, “opposite of” etc.) between two arbitrary words (phrases, concepts, etc.) is an important task affecting many information retrieval and artificial intelligence tasks including query expansion, common-s...
Main Authors: Roussinov, Dmitri; Sharoff, Serge; Puchnina, Nadezhda
Format: Online Article Text
Language: English
Published: 2020
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7148207/
http://dx.doi.org/10.1007/978-3-030-45439-5_37
Similar Items
- Predicting Clinical Events Based on Raw Text: From Bag-of-Words to Attention-Based Transformers
  by: Roussinov, Dmitri, et al.
  Published: (2022)
- Transformer-Based Model with Dynamic Attention Pyramid Head for Semantic Segmentation of VHR Remote Sensing Imagery
  by: Xu, Yufen, et al.
  Published: (2022)
- BertSRC: transformer-based semantic relation classification
  by: Lee, Yeawon, et al.
  Published: (2022)
- Bio-semantic relation extraction with attention-based external knowledge reinforcement
  by: Li, Zhijing, et al.
  Published: (2020)
- Temporal Relation Extraction with Joint Semantic and Syntactic Attention
  by: Jin, Panpan, et al.
  Published: (2022)