PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning
Hypergraphs are powerful tools for modeling complex interactions across various domains, including biomedicine. However, learning meaningful node representations from hypergraphs remains a challenge. Existing supervised methods often lack generalizability, thereby limiting their real-world applications. We propose a new method, Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning (PhyGCN), which leverages hypergraph structure for self-supervision to enhance node representations. PhyGCN introduces a unique training strategy that integrates variable hyperedge sizes with self-supervised learning, enabling improved generalization to unseen data. Applications on multi-way chromatin interactions and polypharmacy side-effects demonstrate the effectiveness of PhyGCN. As a generic framework for high-order interaction datasets with abundant unlabeled data, PhyGCN holds strong potential for enhancing hypergraph node representations across various domains.
Main Authors: | Deng, Yihe, Zhang, Ruochi, Xu, Pan, Ma, Jian, Gu, Quanquan |
---|---|
Format: | Online Article Text |
Language: | English |
Published: | Cold Spring Harbor Laboratory, 2023 |
Subjects: | Article |
Online Access: | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10592843/ https://www.ncbi.nlm.nih.gov/pubmed/37873233 http://dx.doi.org/10.1101/2023.10.01.560404 |
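The abstract above centers on hypergraph convolution, in which node features are propagated through the hyperedges they share. The snippet below is a minimal NumPy sketch of a standard HGNN-style hypergraph convolution layer, offered only to make that operation concrete; it is not the authors' released PhyGCN implementation, and the function name `hypergraph_conv` and the matrices `H`, `X`, and `Theta` are illustrative assumptions.

```python
# Minimal sketch of one standard hypergraph convolution step (HGNN-style),
# assuming a binary incidence matrix; not the authors' PhyGCN code.
import numpy as np

def hypergraph_conv(H: np.ndarray, X: np.ndarray, Theta: np.ndarray) -> np.ndarray:
    """One propagation step: X' = ReLU(Dv^-1/2 H De^-1 H^T Dv^-1/2 X Theta).

    H     : (n_nodes, n_edges) incidence matrix, H[v, e] = 1 if node v is in hyperedge e
    X     : (n_nodes, d_in) node feature matrix
    Theta : (d_in, d_out) layer weight matrix (learnable in practice)
    """
    dv = H.sum(axis=1)                                   # node degrees
    de = H.sum(axis=0)                                   # hyperedge sizes (may vary)
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(dv, 1e-12)))
    De_inv = np.diag(1.0 / np.maximum(de, 1e-12))
    # Normalized propagation through shared hyperedges
    A = Dv_inv_sqrt @ H @ De_inv @ H.T @ Dv_inv_sqrt
    return np.maximum(A @ X @ Theta, 0.0)                # ReLU activation

# Toy usage: 4 nodes, one 3-way and one 2-way interaction (variable hyperedge sizes)
H = np.array([[1, 0],
              [1, 1],
              [1, 0],
              [0, 1]], dtype=float)
X = np.random.randn(4, 8)
Theta = np.random.randn(8, 4)
print(hypergraph_conv(H, X, Theta).shape)                # (4, 4)
```

In a self-supervised setup such as the one the abstract describes, layers like this would be pre-trained on an auxiliary task derived from the hypergraph structure itself (for example, predicting held-out hyperedges) before fine-tuning on the downstream node-level task.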
_version_ | 1785124352455344128 |
---|---|
author | Deng, Yihe Zhang, Ruochi Xu, Pan Ma, Jian Gu, Quanquan |
author_facet | Deng, Yihe Zhang, Ruochi Xu, Pan Ma, Jian Gu, Quanquan |
author_sort | Deng, Yihe |
collection | PubMed |
description | Hypergraphs are powerful tools for modeling complex interactions across various domains, including biomedicine. However, learning meaningful node representations from hypergraphs remains a challenge. Existing supervised methods often lack generalizability, thereby limiting their real-world applications. We propose a new method, Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning (PhyGCN), which leverages hypergraph structure for self-supervision to enhance node representations. PhyGCN introduces a unique training strategy that integrates variable hyperedge sizes with self-supervised learning, enabling improved generalization to unseen data. Applications on multi-way chromatin interactions and polypharmacy side-effects demonstrate the effectiveness of PhyGCN. As a generic framework for high-order interaction datasets with abundant unlabeled data, PhyGCN holds strong potential for enhancing hypergraph node representations across various domains. |
format | Online Article Text |
id | pubmed-10592843 |
institution | National Center for Biotechnology Information |
language | English |
publishDate | 2023 |
publisher | Cold Spring Harbor Laboratory |
record_format | MEDLINE/PubMed |
spelling | pubmed-10592843 2023-10-24 PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning Deng, Yihe Zhang, Ruochi Xu, Pan Ma, Jian Gu, Quanquan bioRxiv Article Hypergraphs are powerful tools for modeling complex interactions across various domains, including biomedicine. However, learning meaningful node representations from hypergraphs remains a challenge. Existing supervised methods often lack generalizability, thereby limiting their real-world applications. We propose a new method, Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning (PhyGCN), which leverages hypergraph structure for self-supervision to enhance node representations. PhyGCN introduces a unique training strategy that integrates variable hyperedge sizes with self-supervised learning, enabling improved generalization to unseen data. Applications on multi-way chromatin interactions and polypharmacy side-effects demonstrate the effectiveness of PhyGCN. As a generic framework for high-order interaction datasets with abundant unlabeled data, PhyGCN holds strong potential for enhancing hypergraph node representations across various domains. Cold Spring Harbor Laboratory 2023-10-02 /pmc/articles/PMC10592843/ /pubmed/37873233 http://dx.doi.org/10.1101/2023.10.01.560404 Text en https://creativecommons.org/licenses/by-nc-nd/4.0/ This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 International License (https://creativecommons.org/licenses/by-nc-nd/4.0/), which allows reusers to copy and distribute the material in any medium or format in unadapted form only, for noncommercial purposes only, and only so long as attribution is given to the creator. |
spellingShingle | Article Deng, Yihe Zhang, Ruochi Xu, Pan Ma, Jian Gu, Quanquan PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title | PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title_full | PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title_fullStr | PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title_full_unstemmed | PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title_short | PhyGCN: Pre-trained Hypergraph Convolutional Neural Networks with Self-supervised Learning |
title_sort | phygcn: pre-trained hypergraph convolutional neural networks with self-supervised learning |
topic | Article |
url | https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10592843/ https://www.ncbi.nlm.nih.gov/pubmed/37873233 http://dx.doi.org/10.1101/2023.10.01.560404 |
work_keys_str_mv | AT dengyihe phygcnpretrainedhypergraphconvolutionalneuralnetworkswithselfsupervisedlearning AT zhangruochi phygcnpretrainedhypergraphconvolutionalneuralnetworkswithselfsupervisedlearning AT xupan phygcnpretrainedhypergraphconvolutionalneuralnetworkswithselfsupervisedlearning AT majian phygcnpretrainedhypergraphconvolutionalneuralnetworkswithselfsupervisedlearning AT guquanquan phygcnpretrainedhypergraphconvolutionalneuralnetworkswithselfsupervisedlearning |