
Improving deep learning model performance under parametric constraints for materials informatics applications

Bibliographic Details
Main Authors: Gupta, Vishu; Peltekian, Alec; Liao, Wei-keng; Choudhary, Alok; Agrawal, Ankit
Format: Online Article Text
Language: English
Published: Nature Publishing Group UK 2023
Subjects: Article
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10241826/
https://www.ncbi.nlm.nih.gov/pubmed/37277456
http://dx.doi.org/10.1038/s41598-023-36336-5
author Gupta, Vishu
Peltekian, Alec
Liao, Wei-keng
Choudhary, Alok
Agrawal, Ankit
author_sort Gupta, Vishu
collection PubMed
description Modern machine learning (ML) and deep learning (DL) techniques using high-dimensional data representations have helped accelerate the materials discovery process by efficiently detecting hidden patterns in existing datasets and linking input representations to output properties for a better understanding of the underlying scientific phenomena. While deep neural networks composed of fully connected layers have been widely used for materials property prediction, simply creating a deeper model with a large number of layers often runs into the vanishing gradient problem, which degrades performance and thereby limits usage. In this paper, we study and propose architectural principles to address the question of improving the performance of model training and inference under fixed parametric constraints. Here, we present a general deep-learning framework based on branched residual learning (BRNet) with fully connected layers that can work with any numerical vector-based representation as input to build accurate models for predicting materials properties. We perform model training for materials properties using numerical vectors representing different composition-based attributes of the respective materials and compare the performance of the proposed models against traditional ML and existing DL architectures. We find that the proposed models are significantly more accurate than the ML/DL models across all data sizes when using different composition-based attributes as input. Further, branched learning requires fewer parameters and converges faster during training than existing neural networks, thereby efficiently building accurate models for predicting materials properties.
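The abstract above describes the key idea: fully connected residual blocks with branched paths that ease gradient flow while staying within a fixed parameter budget. The record does not specify BRNet's actual layer widths, branch layout, or framework, so the following PyTorch sketch is only an illustrative assumption of how a branched residual fully connected regressor over composition-based feature vectors might look; the class names, hidden size, branch count, and the 145-dimensional input are hypothetical.

```python
# Hypothetical sketch of a branched residual fully connected regressor,
# loosely inspired by the BRNet idea in the abstract; the real architecture,
# layer widths, and branching scheme are assumptions, not the paper's code.
import torch
import torch.nn as nn


class ResidualFCBlock(nn.Module):
    """Fully connected block with an identity skip to ease gradient flow."""

    def __init__(self, dim: int):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(dim, dim),
            nn.ReLU(),
            nn.Linear(dim, dim),
        )
        self.act = nn.ReLU()

    def forward(self, x):
        # Residual connection: add the block input back before activation.
        return self.act(self.net(x) + x)


class BranchedResidualNet(nn.Module):
    """Shared trunk followed by parallel residual branches (hypothetical)."""

    def __init__(self, in_dim: int, hidden: int = 128, branches: int = 2):
        super().__init__()
        self.trunk = nn.Sequential(nn.Linear(in_dim, hidden), nn.ReLU())
        self.branches = nn.ModuleList(
            [ResidualFCBlock(hidden) for _ in range(branches)]
        )
        self.head = nn.Linear(hidden * branches, 1)  # scalar property output

    def forward(self, x):
        h = self.trunk(x)
        # Each branch refines the shared representation independently,
        # and the concatenated outputs feed a single regression head.
        outs = [branch(h) for branch in self.branches]
        return self.head(torch.cat(outs, dim=-1))


if __name__ == "__main__":
    # Example: a 145-dimensional composition-based feature vector per material
    # (the dimensionality is chosen only for illustration).
    model = BranchedResidualNet(in_dim=145)
    y = model(torch.randn(8, 145))
    print(y.shape)  # torch.Size([8, 1])
```

As in the abstract's argument, the skip connections are what counter the vanishing-gradient degradation of plain deep fully connected stacks, while the branches add capacity without a single very deep path; the exact way the published BRNet distributes parameters across branches is not recoverable from this record.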
format Online
Article
Text
id pubmed-10241826
institution National Center for Biotechnology Information
language English
publishDate 2023
publisher Nature Publishing Group UK
record_format MEDLINE/PubMed
spelling pubmed-10241826 2023-06-07 Improving deep learning model performance under parametric constraints for materials informatics applications. Gupta, Vishu; Peltekian, Alec; Liao, Wei-keng; Choudhary, Alok; Agrawal, Ankit. Sci Rep, Article. Nature Publishing Group UK, 2023-06-05. /pmc/articles/PMC10241826/ /pubmed/37277456 http://dx.doi.org/10.1038/s41598-023-36336-5 Text en © The Author(s) 2023. Open Access: this article is licensed under a Creative Commons Attribution 4.0 International License (https://creativecommons.org/licenses/by/4.0/).
title Improving deep learning model performance under parametric constraints for materials informatics applications
topic Article
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10241826/
https://www.ncbi.nlm.nih.gov/pubmed/37277456
http://dx.doi.org/10.1038/s41598-023-36336-5