
Graph Multihead Attention Pooling with Self-Supervised Learning

Bibliographic Details
Main Authors: Wang, Yu; Hu, Liang; Wu, Yang; Gao, Wanfu
Format: Online Article Text
Language: English
Published: MDPI 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9777688/
https://www.ncbi.nlm.nih.gov/pubmed/36554149
http://dx.doi.org/10.3390/e24121745
author Wang, Yu
Hu, Liang
Wu, Yang
Gao, Wanfu
collection PubMed
description Graph neural networks (GNNs), which work with graph-structured data, have attracted considerable attention and achieved promising performance on graph-related tasks. While the majority of existing GNN methods focus on the convolutional operation for encoding node representations, the graph pooling operation, which maps a set of nodes into a coarsened graph, is crucial for graph-level tasks. We argue that a well-defined graph pooling operation should avoid losing information about both the local node features and the global graph structure. In this paper, we propose a hierarchical graph pooling method based on the multihead attention mechanism, namely GMAPS, which compresses both node features and graph structure into the coarsened graph. Specifically, a multihead attention mechanism is adopted to assign nodes to a coarsened graph based on their features and the structural dependencies between nodes. In addition, to enhance the expressiveness of the cluster representations, a self-supervised mechanism is introduced to maximize the mutual information between the cluster representations and the global representation of the hierarchical graph. Our experimental results show that the proposed GMAPS obtains significant and consistent performance improvements over state-of-the-art baselines on six benchmarks from the biological and social domains, covering graph classification and graph reconstruction tasks.
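The coarsening step described in the abstract can be sketched as an attention-style soft cluster assignment. The following is a minimal NumPy illustration under stated assumptions: a single attention head with randomly initialized projections standing in for learned parameters, and no self-supervised mutual-information term. It is not the authors' implementation; `attention_pool` and its parameters are hypothetical names for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax along the given axis.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_pool(X, A, n_clusters, d_k=8, seed=0):
    """Coarsen a graph via an attention-style soft cluster assignment.

    X: (n, f) node features; A: (n, n) adjacency.
    Returns coarsened features (C, f), coarsened adjacency (C, C),
    and the (n, C) soft assignment matrix S.
    """
    rng = np.random.default_rng(seed)
    n, f = X.shape
    W_k = rng.standard_normal((f, d_k)) / np.sqrt(f)  # key projection (random stand-in for a learned weight)
    Q = rng.standard_normal((n_clusters, d_k))        # cluster queries (learned in practice)
    K = X @ W_k                                       # node keys, (n, d_k)
    S = softmax(K @ Q.T / np.sqrt(d_k), axis=1)       # each node's soft membership over clusters
    X_pool = S.T @ X                                  # cluster-level features
    A_pool = S.T @ A @ S                              # cluster-level connectivity
    return X_pool, A_pool, S

# Tiny example: pool a 6-node fully connected graph down to 2 clusters.
X = np.arange(24, dtype=float).reshape(6, 4)
A = np.ones((6, 6)) - np.eye(6)
Xp, Ap, S = attention_pool(X, A, n_clusters=2)
print(Xp.shape, Ap.shape)  # (2, 4) (2, 2)
```

A multihead variant would run several such assignments in parallel with separate projections and combine their outputs; both feature information (via `X_pool`) and structural information (via `A_pool`) survive the coarsening, which is the property the abstract argues a well-defined pooling operation should have.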
format Online
Article
Text
id pubmed-9777688
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher MDPI
record_format MEDLINE/PubMed
spelling pubmed-9777688 2022-12-23 Graph Multihead Attention Pooling with Self-Supervised Learning. Wang, Yu; Hu, Liang; Wu, Yang; Gao, Wanfu. Entropy (Basel), Article. MDPI 2022-11-29 /pmc/articles/PMC9777688/ /pubmed/36554149 http://dx.doi.org/10.3390/e24121745 Text en © 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
title Graph Multihead Attention Pooling with Self-Supervised Learning
topic Article