
ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction


Bibliographic Details
Main Authors: Liu, Chengyou, Sun, Yan, Davis, Rebecca, Cardona, Silvia T., Hu, Pingzhao
Format: Online Article Text
Language: English
Published: Springer International Publishing 2023
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9968697/
https://www.ncbi.nlm.nih.gov/pubmed/36843022
http://dx.doi.org/10.1186/s13321-023-00698-9
_version_ 1784897554860736512
author Liu, Chengyou
Sun, Yan
Davis, Rebecca
Cardona, Silvia T.
Hu, Pingzhao
author_facet Liu, Chengyou
Sun, Yan
Davis, Rebecca
Cardona, Silvia T.
Hu, Pingzhao
author_sort Liu, Chengyou
collection PubMed
description Graph convolutional neural networks (GCNs) have been repeatedly shown to have robust capacities for modeling graph data such as small molecules. Message-passing neural networks (MPNNs), a group of GCN variants that can learn and aggregate local information of molecules through iterative message passing, have exhibited advancements in molecular modeling and property prediction. Moreover, given the merits of Transformers in multiple artificial intelligence domains, it is desirable to combine the self-attention mechanism with MPNNs for better molecular representation. We propose an atom-bond transformer-based message-passing neural network (ABT-MPNN) to improve the molecular representation embedding process for molecular property prediction. By designing corresponding attention mechanisms in the message-passing and readout phases of the MPNN, our method provides a novel architecture that integrates molecular representations at the bond, atom and molecule levels in an end-to-end way. The experimental results across nine datasets show that the proposed ABT-MPNN outperforms or is comparable to the state-of-the-art baseline models in quantitative structure–property relationship tasks. We provide case examples of Mycobacterium tuberculosis growth inhibitors and demonstrate that our model's visualization modality of attention at the atomic level could be an insightful way to investigate molecular atoms or functional groups associated with desired biological properties. The new model provides an innovative way to investigate the effect of self-attention on chemical substructures and functional groups in molecular representation learning, which increases the interpretability of the traditional MPNN and can serve as a valuable way to investigate the mechanism of action of drugs. SUPPLEMENTARY INFORMATION: The online version contains supplementary material available at 10.1186/s13321-023-00698-9.
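The two phases named in the abstract (iterative message passing over the molecular graph, then an attention-weighted readout that pools atom states into a molecule embedding) can be illustrated with a minimal generic sketch. This is not the ABT-MPNN architecture itself: the function names, feature dimensions, and single-head scalar attention below are hypothetical simplifications for illustration only.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def message_pass(H, A, W, steps=3):
    # H: (n_atoms, d) atom features; A: (n_atoms, n_atoms) adjacency matrix.
    # Each step aggregates neighbor messages and updates atom states.
    for _ in range(steps):
        M = A @ H            # sum messages from bonded neighbors
        H = np.tanh(M @ W)   # learned update (toy: one shared weight matrix)
    return H

def attention_readout(H, w_att):
    # One scalar attention score per atom, softmax-normalized, then a
    # weighted sum of atom states yields the molecule-level embedding.
    scores = softmax(H @ w_att)      # (n_atoms,)
    return scores, scores @ H        # molecule embedding: (d,)

# Toy molecule: 3 atoms in a ring, 4-dimensional features
rng = np.random.default_rng(0)
A = np.array([[0, 1, 1],
              [1, 0, 1],
              [1, 1, 0]], dtype=float)
H0 = rng.normal(size=(3, 4))
W = rng.normal(size=(4, 4))
w_att = rng.normal(size=4)

H = message_pass(H0, A, W)
scores, mol = attention_readout(H, w_att)
```

The per-atom `scores` are what an atomic-level attention visualization, as described in the abstract, would color the molecule by; the pooled `mol` vector would feed a downstream property-prediction head.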
format Online
Article
Text
id pubmed-9968697
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-9968697 2023-02-28 ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction Liu, Chengyou Sun, Yan Davis, Rebecca Cardona, Silvia T. Hu, Pingzhao J Cheminform Research Springer International Publishing 2023-02-26 /pmc/articles/PMC9968697/ /pubmed/36843022 http://dx.doi.org/10.1186/s13321-023-00698-9 Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
spellingShingle Research
Liu, Chengyou
Sun, Yan
Davis, Rebecca
Cardona, Silvia T.
Hu, Pingzhao
ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title_full ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title_fullStr ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title_full_unstemmed ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title_short ABT-MPNN: an atom-bond transformer-based message-passing neural network for molecular property prediction
title_sort abt-mpnn: an atom-bond transformer-based message-passing neural network for molecular property prediction
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9968697/
https://www.ncbi.nlm.nih.gov/pubmed/36843022
http://dx.doi.org/10.1186/s13321-023-00698-9
work_keys_str_mv AT liuchengyou abtmpnnanatombondtransformerbasedmessagepassingneuralnetworkformolecularpropertyprediction
AT sunyan abtmpnnanatombondtransformerbasedmessagepassingneuralnetworkformolecularpropertyprediction
AT davisrebecca abtmpnnanatombondtransformerbasedmessagepassingneuralnetworkformolecularpropertyprediction
AT cardonasilviat abtmpnnanatombondtransformerbasedmessagepassingneuralnetworkformolecularpropertyprediction
AT hupingzhao abtmpnnanatombondtransformerbasedmessagepassingneuralnetworkformolecularpropertyprediction