
Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding

Bibliographic Details
Main Authors: Liu, Yang, Lim, Hansaim, Xie, Lei
Format: Online Article Text
Language: English
Published: BioMed Central 2022
Subjects: Research
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9063120/
https://www.ncbi.nlm.nih.gov/pubmed/35501680
http://dx.doi.org/10.1186/s12859-022-04681-3
_version_ 1784699099405090816
author Liu, Yang
Lim, Hansaim
Xie, Lei
author_facet Liu, Yang
Lim, Hansaim
Xie, Lei
author_sort Liu, Yang
collection PubMed
description BACKGROUND: Drug discovery is time-consuming and costly. Machine learning, especially deep learning, shows great potential in quantitative structure–activity relationship (QSAR) modeling to accelerate the drug discovery process and reduce its cost. A major challenge in developing robust and generalizable deep learning models for QSAR is the lack of a large amount of data with high-quality, balanced labels. To address this challenge, we developed a self-training method, Partially LAbeled Noisy Student (PLANS), and a novel self-supervised graph embedding, Graph-Isomorphism-Network Fingerprint (GINFP), which represents chemical compounds with substructure information using unlabeled data. The representations can be used for predicting chemical properties such as binding affinity and toxicity. PLANS-GINFP allows us to exploit millions of unlabeled chemical compounds, as well as labeled and partially labeled pharmacological data, to improve the generalizability of neural network models. RESULTS: We evaluated the performance of PLANS-GINFP for predicting Cytochrome P450 (CYP450) binding activity in a CYP450 dataset and chemical toxicity in the Tox21 dataset. Extensive benchmark studies demonstrated that PLANS-GINFP improved performance in both cases by a large margin. Both PLANS-based self-training and GINFP-based self-supervised learning contribute to the improvement. CONCLUSION: To better exploit chemical structures as input for machine learning algorithms, we proposed a self-supervised graph neural network-based embedding method that encodes substructure information. Furthermore, we developed a model-agnostic self-training method, PLANS, that can be applied to any deep learning architecture to improve prediction accuracy. PLANS provides a way to better utilize partially labeled and unlabeled data. Comprehensive benchmark studies demonstrated their potential for predicting drug metabolism and toxicity profiles from sparse, noisy, and imbalanced data. PLANS-GINFP could serve as a general solution to improve predictive modeling for QSAR.
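The description above names two components: a GIN-based graph fingerprint (GINFP) and a partially labeled noisy-student self-training loop (PLANS). The two sketches below are illustrative only; they are not the authors' released code, and every layer size, noise level, iteration count, and helper name in them is an assumption chosen for brevity.

First, a minimal NumPy sketch of a GIN-style node update with a sum-pooled, per-layer readout, which is the general mechanism a Graph-Isomorphism-Network fingerprint uses to capture substructure information. The real GINFP is additionally trained self-supervised on large unlabeled compound collections, which is omitted here.

```python
# Illustrative GIN-style graph readout in plain NumPy.
# NOT the authors' GINFP implementation; layer sizes and the
# molecule-to-graph featurisation are placeholders.
import numpy as np

rng = np.random.default_rng(0)

def gin_layer(h, adj, w1, w2, eps=0.0):
    """One GIN update: h_v <- MLP((1 + eps) * h_v + sum of neighbour features)."""
    agg = (1.0 + eps) * h + adj @ h          # sum aggregation over neighbours
    hidden = np.maximum(agg @ w1, 0.0)       # two-layer MLP with ReLU
    return np.maximum(hidden @ w2, 0.0)

def gin_fingerprint(node_feats, adj, weights):
    """Stack GIN layers and sum-pool node states into a graph-level vector."""
    h = node_feats
    readout = []
    for w1, w2 in weights:
        h = gin_layer(h, adj, w1, w2)
        readout.append(h.sum(axis=0))        # graph-level readout per layer
    return np.concatenate(readout)           # concatenated multi-scale fingerprint

# Toy 4-atom "molecule": random atom features and a symmetric adjacency matrix.
x = rng.normal(size=(4, 8))
a = np.array([[0, 1, 0, 0],
              [1, 0, 1, 1],
              [0, 1, 0, 0],
              [0, 1, 0, 0]], dtype=float)
ws = [(rng.normal(size=(8, 16)), rng.normal(size=(16, 16))),
      (rng.normal(size=(16, 16)), rng.normal(size=(16, 16)))]
print(gin_fingerprint(x, a, ws).shape)       # (32,) fingerprint for this toy graph
```

Second, a schematic version of the partially labeled noisy-student idea: a teacher fills in the missing entries of a multi-task label matrix and pseudo-labels an unlabeled pool, then a "noisy" student is retrained on the completed data and becomes the next teacher. Logistic regression stands in for the neural networks used in the paper, and the NaN-as-missing-label convention is an assumption of this sketch.

```python
# Schematic noisy-student self-training over a partially labelled
# multi-task matrix. Logistic regression replaces the deep models of
# the paper; noise level and round count are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)

def fit_teacher(X, Y):
    """Fit one classifier per task, using only the rows where that task is labelled."""
    models = []
    for t in range(Y.shape[1]):
        known = ~np.isnan(Y[:, t])
        models.append(LogisticRegression(max_iter=1000).fit(X[known], Y[known, t]))
    return models

def pseudo_label(models, X, Y):
    """Fill the missing entries of Y with the teacher's hard predictions."""
    Y_full = Y.copy()
    for t, m in enumerate(models):
        missing = np.isnan(Y_full[:, t])
        if missing.any():
            Y_full[missing, t] = m.predict(X[missing])
    return Y_full

def plans_style_self_training(X, Y, X_unlabeled, rounds=3, noise=0.1):
    """Teacher labels the gaps, a noisy student retrains on everything, repeat."""
    teacher = fit_teacher(X, Y)
    for _ in range(rounds):
        Y_full = pseudo_label(teacher, X, Y)                     # complete the partial labels
        Y_unl = np.column_stack([m.predict(X_unlabeled) for m in teacher])
        X_all = np.vstack([X, X_unlabeled])
        Y_all = np.vstack([Y_full, Y_unl])
        X_noisy = X_all + noise * rng.normal(size=X_all.shape)   # input noise for the student
        teacher = fit_teacher(X_noisy, Y_all)                    # student becomes next teacher
    return teacher

# Toy data: 2 tasks, roughly 30% of the labels missing, plus an unlabeled pool.
X = rng.normal(size=(200, 16))
Y = (X[:, :2] > 0).astype(float)
Y[rng.random(Y.shape) < 0.3] = np.nan
models = plans_style_self_training(X, Y, X_unlabeled=rng.normal(size=(500, 16)))
print([m.score(X, (X[:, :2] > 0).astype(float)[:, i]) for i, m in enumerate(models)])
```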
format Online
Article
Text
id pubmed-9063120
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher BioMed Central
record_format MEDLINE/PubMed
spelling pubmed-9063120 2022-05-04 Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding. Liu, Yang; Lim, Hansaim; Xie, Lei. BMC Bioinformatics (Research). BioMed Central, 2022-05-02. /pmc/articles/PMC9063120/ /pubmed/35501680 http://dx.doi.org/10.1186/s12859-022-04681-3 Text en © The Author(s) 2022. Licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
spellingShingle Research
Liu, Yang
Lim, Hansaim
Xie, Lei
Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title_full Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title_fullStr Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title_full_unstemmed Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title_short Exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
title_sort exploration of chemical space with partial labeled noisy student self-training and self-supervised graph embedding
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9063120/
https://www.ncbi.nlm.nih.gov/pubmed/35501680
http://dx.doi.org/10.1186/s12859-022-04681-3
work_keys_str_mv AT liuyang explorationofchemicalspacewithpartiallabelednoisystudentselftrainingandselfsupervisedgraphembedding
AT limhansaim explorationofchemicalspacewithpartiallabelednoisystudentselftrainingandselfsupervisedgraphembedding
AT xielei explorationofchemicalspacewithpartiallabelednoisystudentselftrainingandselfsupervisedgraphembedding