
Inductive transfer learning for molecular activity prediction: Next-Gen QSAR Models with MolPMoFiT

Bibliographic Details
Main Authors: Li, Xinhao; Fourches, Denis
Format: Online Article Text
Language: English
Published: Springer International Publishing, 2020
Subjects: Research Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7178569/
https://www.ncbi.nlm.nih.gov/pubmed/33430978
http://dx.doi.org/10.1186/s13321-020-00430-x
author Li, Xinhao
Fourches, Denis
collection PubMed
description Deep neural networks can directly learn from chemical structures without extensive, user-driven selection of descriptors in order to predict molecular properties/activities with high reliability. But these approaches typically require large training sets to learn the endpoint-specific structural features and ensure reasonable prediction accuracy. Even though large datasets are becoming the new normal in drug discovery, especially when it comes to high-throughput screening or metabolomics datasets, one should also consider smaller datasets with challenging endpoints to model and forecast. Thus, it would be highly relevant to better utilize the tremendous compendium of unlabeled compounds from publicly-available datasets for improving the model performances for the user’s particular series of compounds. In this study, we propose the Molecular Prediction Model Fine-Tuning (MolPMoFiT) approach, an effective transfer learning method based on self-supervised pre-training + task-specific fine-tuning for QSPR/QSAR modeling. A large-scale molecular structure prediction model is pre-trained using one million unlabeled molecules from ChEMBL in a self-supervised learning manner, and can then be fine-tuned on various QSPR/QSAR tasks for smaller chemical datasets with specific endpoints. Herein, the method is evaluated on four benchmark datasets (lipophilicity, FreeSolv, HIV, and blood–brain barrier penetration). The results showed the method can achieve strong performances for all four datasets compared to other state-of-the-art machine learning modeling techniques reported in the literature so far. [Image: see text]
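
The description above boils down to a two-stage transfer learning workflow: first pre-train a molecular language model on unlabeled SMILES with a self-supervised next-token objective, then reuse that encoder and fine-tune it with a small task-specific head on a labeled QSPR/QSAR set. The sketch below is a minimal illustration of that pattern in plain PyTorch; the toy SMILES, hypothetical labels, character-level tokenizer, network sizes, and training loops are assumptions made for this example and do not reproduce the authors' actual implementation (built on ULMFiT in fastai), tokenization scheme, or hyperparameters.

```python
# Minimal, illustrative sketch of the two-stage idea described in the abstract:
# (1) self-supervised next-token pre-training of a SMILES language model,
# (2) fine-tuning the same encoder with a small regression head on labeled data.
# Toy data, character-level tokenization, and model sizes are placeholders.

import torch
import torch.nn as nn

# --- Toy data (stand-ins for ~1M unlabeled ChEMBL SMILES and a small labeled set) ---
unlabeled_smiles = ["CCO", "c1ccccc1", "CC(=O)Oc1ccccc1C(=O)O", "CCN(CC)CC"]
labeled_smiles   = ["CCO", "CCN(CC)CC"]
labels           = torch.tensor([[-0.14], [1.45]])        # hypothetical endpoint values

# --- Character-level vocabulary (a simplification of SMILES tokenization) ---
chars = sorted(set("".join(unlabeled_smiles + labeled_smiles)))
stoi = {c: i + 1 for i, c in enumerate(chars)}            # 0 is reserved for padding

def encode(smiles, max_len=48):
    ids = [stoi[c] for c in smiles][:max_len]
    return ids + [0] * (max_len - len(ids))

class SmilesEncoder(nn.Module):
    """Shared embedding + LSTM encoder reused across both training stages."""
    def __init__(self, vocab_size, emb=64, hidden=128):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, emb, padding_idx=0)
        self.lstm = nn.LSTM(emb, hidden, batch_first=True)

    def forward(self, x):
        out, _ = self.lstm(self.emb(x))
        return out                                         # (batch, seq, hidden)

vocab_size = len(stoi) + 1
encoder = SmilesEncoder(vocab_size)
lm_head = nn.Linear(128, vocab_size)                       # predicts the next token
reg_head = nn.Linear(128, 1)                               # predicts the endpoint

# --- Stage 1: self-supervised pre-training (next-character prediction) ---
x = torch.tensor([encode(s) for s in unlabeled_smiles])
inp, tgt = x[:, :-1], x[:, 1:]
opt = torch.optim.Adam(list(encoder.parameters()) + list(lm_head.parameters()), lr=1e-3)
ce = nn.CrossEntropyLoss(ignore_index=0)                   # padding positions are ignored
for _ in range(50):
    logits = lm_head(encoder(inp))
    loss = ce(logits.reshape(-1, vocab_size), tgt.reshape(-1))
    opt.zero_grad(); loss.backward(); opt.step()

# --- Stage 2: task-specific fine-tuning on the small labeled dataset ---
xl = torch.tensor([encode(s) for s in labeled_smiles])
opt = torch.optim.Adam(list(encoder.parameters()) + list(reg_head.parameters()), lr=1e-4)
mse = nn.MSELoss()
for _ in range(100):
    pooled = encoder(xl).mean(dim=1)                       # simple mean-pooled molecule vector
    loss = mse(reg_head(pooled), labels)
    opt.zero_grad(); loss.backward(); opt.step()

print("fine-tuned prediction for CCO:",
      reg_head(encoder(torch.tensor([encode("CCO")])).mean(dim=1)).item())
```

In the real setting, the pre-training corpus would be the one million ChEMBL SMILES mentioned above and the fine-tuning set one of the benchmark endpoints (e.g., lipophilicity); refinements such as gradual unfreezing and discriminative learning rates, which ULMFiT-style fine-tuning typically employs, are omitted here for brevity.
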
format Online
Article
Text
id pubmed-7178569
institution National Center for Biotechnology Information
language English
publishDate 2020
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling J Cheminform (Research Article), Springer International Publishing, published online 2020-04-22. © The Author(s) 2020. Open Access under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/); the Creative Commons Public Domain Dedication (http://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in the article.
title Inductive transfer learning for molecular activity prediction: Next-Gen QSAR Models with MolPMoFiT
topic Research Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7178569/
https://www.ncbi.nlm.nih.gov/pubmed/33430978
http://dx.doi.org/10.1186/s13321-020-00430-x