Graph Multihead Attention Pooling with Self-Supervised Learning
Graph neural networks (GNNs), which work with graph-structured data, have attracted considerable attention and achieved promising performance on graph-related tasks. While the majority of existing GNN methods focus on the convolutional operation for encoding the node representations, the graph pooli...
Main Authors: Wang, Yu; Hu, Liang; Wu, Yang; Gao, Wanfu
Format: Online Article Text
Language: English
Published: MDPI, 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9777688/
https://www.ncbi.nlm.nih.gov/pubmed/36554149
http://dx.doi.org/10.3390/e24121745
Similar Items
- Document-Level Biomedical Relation Extraction Using Graph Convolutional Network and Multihead Attention: Algorithm Development and Validation
  by: Wang, Jian, et al.
  Published: (2020)
- Medical Text Classification Using Hybrid Deep Learning Models with Multihead Attention
  by: Prabhakar, Sunil Kumar, et al.
  Published: (2021)
- Chemical–protein interaction extraction via contextualized word representations and multihead attention
  by: Zhang, Yijia, et al.
  Published: (2019)
- Incorporating representation learning and multihead attention to improve biomedical cross-sentence n-ary relation extraction
  by: Zhao, Di, et al.
  Published: (2020)
- Research on Named Entity Recognition Method of Metro On-Board Equipment Based on Multiheaded Self-Attention Mechanism and CNN-BiLSTM-CRF
  by: Lin, Junting, et al.
  Published: (2022)