
Hierarchical graph transformer with contrastive learning for protein function prediction

MOTIVATION: In recent years, high-throughput sequencing technologies have made large-scale protein sequences accessible. However, their functional annotations usually rely on low-throughput and pricey experimental studies. Computational prediction models offer a promising alternative to accelerate this process.


Bibliographic Details
Main Authors: Gu, Zhonghui, Luo, Xiao, Chen, Jiaxiao, Deng, Minghua, Lai, Luhua
Format: Online Article Text
Language: English
Published: Oxford University Press 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10338137/
https://www.ncbi.nlm.nih.gov/pubmed/37369035
http://dx.doi.org/10.1093/bioinformatics/btad410
_version_ 1785071564090245120
author Gu, Zhonghui
Luo, Xiao
Chen, Jiaxiao
Deng, Minghua
Lai, Luhua
author_facet Gu, Zhonghui
Luo, Xiao
Chen, Jiaxiao
Deng, Minghua
Lai, Luhua
author_sort Gu, Zhonghui
collection PubMed
description MOTIVATION: In recent years, high-throughput sequencing technologies have made large-scale protein sequences accessible. However, their functional annotations usually rely on low-throughput and pricey experimental studies. Computational prediction models offer a promising alternative to accelerate this process. Graph neural networks have shown significant progress in protein research, but capturing long-distance structural correlations and identifying key residues in protein graphs remains challenging. RESULTS: In the present study, we propose a novel deep learning model named Hierarchical graph transformEr with contrAstive Learning (HEAL) for protein function prediction. The core feature of HEAL is its ability to capture structural semantics using a hierarchical graph Transformer, which introduces a range of super-nodes mimicking functional motifs to interact with nodes in the protein graph. These semantic-aware super-node embeddings are then aggregated with varying emphasis to produce a graph representation. To optimize the network, we utilized graph contrastive learning as a regularization technique to maximize the similarity between different views of the graph representation. Evaluation on the PDBch test set shows that HEAL-PDB, trained on fewer data, achieves comparable performance to recent state-of-the-art methods, such as DeepFRI. Moreover, HEAL, with the added benefit of unresolved protein structures predicted by AlphaFold2, outperforms DeepFRI by a significant margin on the Fmax, AUPR, and Smin metrics on the PDBch test set. Additionally, when there are no experimentally resolved structures available for the proteins of interest, HEAL can still achieve better performance on the AFch test set than DeepFRI and DeepGOPlus by taking advantage of AlphaFold2 predicted structures. Finally, HEAL is capable of finding functional sites through class activation mapping.
AVAILABILITY AND IMPLEMENTATION: Implementations of HEAL can be found at https://github.com/ZhonghuiGu/HEAL.
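The abstract describes graph contrastive learning as a regularizer that maximizes the similarity between different views of the same graph representation. A minimal sketch of how such an objective can work, using an NT-Xent-style loss in plain NumPy (this is an illustrative assumption, not the authors' exact objective; all function and variable names here are hypothetical):

```python
import numpy as np

def nt_xent_loss(z1, z2, temperature=0.1):
    """Contrastive loss between two batches of graph embeddings.

    z1, z2: (n, d) arrays; row i of each is a different "view" of graph i.
    Rows from the two views of the same graph are pulled together,
    all other pairs are pushed apart.
    """
    # L2-normalize rows so dot products become cosine similarities.
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    n = z1.shape[0]
    z = np.concatenate([z1, z2], axis=0)   # (2n, d): both views stacked
    sim = z @ z.T / temperature            # pairwise scaled similarities
    np.fill_diagonal(sim, -np.inf)         # a sample is not its own positive
    # The positive for row i is the other view of the same graph.
    pos = np.concatenate([np.arange(n, 2 * n), np.arange(n)])
    # Row-wise log-softmax, then pick out the positive-pair entries.
    log_prob = sim - np.log(np.exp(sim).sum(axis=1, keepdims=True))
    return -log_prob[np.arange(2 * n), pos].mean()
```

Used as a regularizer, this term is added to the supervised function-prediction loss; well-aligned views of the same graph yield a lower value than mismatched ones.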
format Online
Article
Text
id pubmed-10338137
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Oxford University Press
record_format MEDLINE/PubMed
spelling pubmed-10338137 2023-07-13 Hierarchical graph transformer with contrastive learning for protein function prediction Gu, Zhonghui Luo, Xiao Chen, Jiaxiao Deng, Minghua Lai, Luhua Bioinformatics Original Paper MOTIVATION: In recent years, high-throughput sequencing technologies have made large-scale protein sequences accessible. However, their functional annotations usually rely on low-throughput and pricey experimental studies. Computational prediction models offer a promising alternative to accelerate this process. Graph neural networks have shown significant progress in protein research, but capturing long-distance structural correlations and identifying key residues in protein graphs remains challenging. RESULTS: In the present study, we propose a novel deep learning model named Hierarchical graph transformEr with contrAstive Learning (HEAL) for protein function prediction. The core feature of HEAL is its ability to capture structural semantics using a hierarchical graph Transformer, which introduces a range of super-nodes mimicking functional motifs to interact with nodes in the protein graph. These semantic-aware super-node embeddings are then aggregated with varying emphasis to produce a graph representation. To optimize the network, we utilized graph contrastive learning as a regularization technique to maximize the similarity between different views of the graph representation. Evaluation on the PDBch test set shows that HEAL-PDB, trained on fewer data, achieves comparable performance to recent state-of-the-art methods, such as DeepFRI. Moreover, HEAL, with the added benefit of unresolved protein structures predicted by AlphaFold2, outperforms DeepFRI by a significant margin on the Fmax, AUPR, and Smin metrics on the PDBch test set.
Additionally, when there are no experimentally resolved structures available for the proteins of interest, HEAL can still achieve better performance on the AFch test set than DeepFRI and DeepGOPlus by taking advantage of AlphaFold2 predicted structures. Finally, HEAL is capable of finding functional sites through class activation mapping. AVAILABILITY AND IMPLEMENTATION: Implementations of HEAL can be found at https://github.com/ZhonghuiGu/HEAL. Oxford University Press 2023-06-27 /pmc/articles/PMC10338137/ /pubmed/37369035 http://dx.doi.org/10.1093/bioinformatics/btad410 Text en © The Author(s) 2023. Published by Oxford University Press. https://creativecommons.org/licenses/by/4.0/ This is an Open Access article distributed under the terms of the Creative Commons Attribution License (https://creativecommons.org/licenses/by/4.0/), which permits unrestricted reuse, distribution, and reproduction in any medium, provided the original work is properly cited.
spellingShingle Original Paper
Gu, Zhonghui
Luo, Xiao
Chen, Jiaxiao
Deng, Minghua
Lai, Luhua
Hierarchical graph transformer with contrastive learning for protein function prediction
title Hierarchical graph transformer with contrastive learning for protein function prediction
title_full Hierarchical graph transformer with contrastive learning for protein function prediction
title_fullStr Hierarchical graph transformer with contrastive learning for protein function prediction
title_full_unstemmed Hierarchical graph transformer with contrastive learning for protein function prediction
title_short Hierarchical graph transformer with contrastive learning for protein function prediction
title_sort hierarchical graph transformer with contrastive learning for protein function prediction
topic Original Paper
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10338137/
https://www.ncbi.nlm.nih.gov/pubmed/37369035
http://dx.doi.org/10.1093/bioinformatics/btad410
work_keys_str_mv AT guzhonghui hierarchicalgraphtransformerwithcontrastivelearningforproteinfunctionprediction
AT luoxiao hierarchicalgraphtransformerwithcontrastivelearningforproteinfunctionprediction
AT chenjiaxiao hierarchicalgraphtransformerwithcontrastivelearningforproteinfunctionprediction
AT dengminghua hierarchicalgraphtransformerwithcontrastivelearningforproteinfunctionprediction
AT lailuhua hierarchicalgraphtransformerwithcontrastivelearningforproteinfunctionprediction