
Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules

Graph Convolutional Neural Networks (GCNNs) are a popular class of deep learning (DL) models used in materials science to predict material properties from graph representations of molecular structures. Training an accurate and comprehensive GCNN surrogate for molecular design requires large-scale graph datasets and is usually time-consuming. Recent advances in GPUs and distributed computing open a path to reducing the computational cost of GCNN training. However, efficient use of high-performance computing (HPC) resources requires simultaneously optimizing large-scale data management and scalable stochastic batched optimization techniques. In this work, we focus on building GCNN models on HPC systems to predict the material properties of millions of molecules. We use HydraGNN, our in-house library for large-scale GCNN training that leverages distributed data parallelism in PyTorch, and ADIOS, a high-performance data management framework, for efficient storage and reading of large molecular graph data. We perform parallel training on two open-source large-scale graph datasets to build a GCNN predictor for an important quantum property known as the HOMO-LUMO gap. We measure the scalability, accuracy, and convergence of our approach on two DOE supercomputers: Summit at the Oak Ridge Leadership Computing Facility (OLCF) and Perlmutter at the National Energy Research Scientific Computing Center (NERSC). Our experimental results with HydraGNN show (i) up to a 4.2x reduction in data-loading time compared with a conventional method and (ii) linear scaling of training performance up to 1,024 GPUs on both Summit and Perlmutter.


Bibliographic Details
Main Authors: Choi, Jong Youl; Zhang, Pei; Mehta, Kshitij; Blanchard, Andrew; Lupo Pasini, Massimiliano
Format: Online Article Text
Language: English
Published: Springer International Publishing 2022
Subjects:
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9575242/
https://www.ncbi.nlm.nih.gov/pubmed/36253845
http://dx.doi.org/10.1186/s13321-022-00652-1
author Choi, Jong Youl
Zhang, Pei
Mehta, Kshitij
Blanchard, Andrew
Lupo Pasini, Massimiliano
collection PubMed
description Graph Convolutional Neural Networks (GCNNs) are a popular class of deep learning (DL) models used in materials science to predict material properties from graph representations of molecular structures. Training an accurate and comprehensive GCNN surrogate for molecular design requires large-scale graph datasets and is usually time-consuming. Recent advances in GPUs and distributed computing open a path to reducing the computational cost of GCNN training. However, efficient use of high-performance computing (HPC) resources requires simultaneously optimizing large-scale data management and scalable stochastic batched optimization techniques. In this work, we focus on building GCNN models on HPC systems to predict the material properties of millions of molecules. We use HydraGNN, our in-house library for large-scale GCNN training that leverages distributed data parallelism in PyTorch, and ADIOS, a high-performance data management framework, for efficient storage and reading of large molecular graph data. We perform parallel training on two open-source large-scale graph datasets to build a GCNN predictor for an important quantum property known as the HOMO-LUMO gap. We measure the scalability, accuracy, and convergence of our approach on two DOE supercomputers: Summit at the Oak Ridge Leadership Computing Facility (OLCF) and Perlmutter at the National Energy Research Scientific Computing Center (NERSC). Our experimental results with HydraGNN show (i) up to a 4.2x reduction in data-loading time compared with a conventional method and (ii) linear scaling of training performance up to 1,024 GPUs on both Summit and Perlmutter.
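The training strategy named in the abstract is distributed data parallelism in PyTorch: each GPU holds a full model replica, sees a disjoint shard of the dataset, and gradients are averaged across ranks at each step. The sketch below illustrates that pattern only; it is not HydraGNN's actual code. The linear regressor, random dataset, and hyperparameters are placeholders (a plain linear layer stands in for a GCNN predicting the HOMO-LUMO gap), and it runs as a single CPU process with the "gloo" backend, whereas a Summit or Perlmutter job would launch one process per GPU with "nccl".

```python
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP
from torch.utils.data import DataLoader, TensorDataset
from torch.utils.data.distributed import DistributedSampler


def train(rank: int = 0, world_size: int = 1) -> float:
    """One epoch of data-parallel training; returns the last batch loss."""
    # Rendezvous info for the process group; a real multi-node launch would
    # get these from the job scheduler (e.g. srun/jsrun environment).
    os.environ.setdefault("MASTER_ADDR", "127.0.0.1")
    os.environ.setdefault("MASTER_PORT", "29500")
    dist.init_process_group("gloo", rank=rank, world_size=world_size)

    # DDP wraps the model so that loss.backward() all-reduces gradients
    # across ranks, keeping every replica's weights in sync.
    model = DDP(torch.nn.Linear(8, 1))
    optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

    # DistributedSampler shards the dataset: each rank iterates over a
    # disjoint 1/world_size slice, so an epoch covers the data exactly once.
    data = TensorDataset(torch.randn(64, 8), torch.randn(64, 1))
    sampler = DistributedSampler(data, num_replicas=world_size, rank=rank)
    loader = DataLoader(data, batch_size=16, sampler=sampler)

    loss = torch.tensor(0.0)
    for x, y in loader:
        optimizer.zero_grad()
        loss = torch.nn.functional.mse_loss(model(x), y)
        loss.backward()  # gradient all-reduce happens here under DDP
        optimizer.step()

    dist.destroy_process_group()
    return loss.item()
```

Because each rank processes batch_size samples per step, the effective global batch size grows with the number of GPUs; this is the main reason the paper must study convergence alongside scaling efficiency.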
format Online
Article
Text
id pubmed-9575242
institution National Center for Biotechnology Information
language English
publishDate 2022
publisher Springer International Publishing
record_format MEDLINE/PubMed
spelling pubmed-9575242 2022-10-18 Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules. Choi, Jong Youl; Zhang, Pei; Mehta, Kshitij; Blanchard, Andrew; Lupo Pasini, Massimiliano. J Cheminform. Research.
Springer International Publishing 2022-10-17 /pmc/articles/PMC9575242/ /pubmed/36253845 http://dx.doi.org/10.1186/s13321-022-00652-1 Text en © The Author(s) 2022. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article's Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article's Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit https://creativecommons.org/licenses/by/4.0/. The Creative Commons Public Domain Dedication waiver (https://creativecommons.org/publicdomain/zero/1.0/) applies to the data made available in this article, unless otherwise stated in a credit line to the data.
title Scalable training of graph convolutional neural networks for fast and accurate predictions of HOMO-LUMO gap in molecules
topic Research
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9575242/
https://www.ncbi.nlm.nih.gov/pubmed/36253845
http://dx.doi.org/10.1186/s13321-022-00652-1