Predicting Materials Properties with Little Data Using Shotgun Transfer Learning

Bibliographic Details
Main Authors: Yamada, Hironao, Liu, Chang, Wu, Stephen, Koyama, Yukinori, Ju, Shenghong, Shiomi, Junichiro, Morikawa, Junko, Yoshida, Ryo
Format: Online Article Text
Language: English
Published: American Chemical Society 2019
Online Access: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6813555/
https://www.ncbi.nlm.nih.gov/pubmed/31660440
http://dx.doi.org/10.1021/acscentsci.9b00804
_version_ 1783462862367227904
author Yamada, Hironao
Liu, Chang
Wu, Stephen
Koyama, Yukinori
Ju, Shenghong
Shiomi, Junichiro
Morikawa, Junko
Yoshida, Ryo
author_facet Yamada, Hironao
Liu, Chang
Wu, Stephen
Koyama, Yukinori
Ju, Shenghong
Shiomi, Junichiro
Morikawa, Junko
Yoshida, Ryo
author_sort Yamada, Hironao
collection PubMed
description There is a growing demand for the use of machine learning (ML) to derive fast-to-evaluate surrogate models of materials properties. In recent years, a broad array of materials property databases have emerged as part of a digital transformation of materials science. However, recent technological advances in ML are not fully exploited because of the insufficient volume and diversity of materials data. An ML framework called “transfer learning” has considerable potential to overcome the problem of limited amounts of materials data. Transfer learning relies on the concept that various property types, such as physical, chemical, electronic, thermodynamic, and mechanical properties, are physically interrelated. For a given target property to be predicted from a limited supply of training data, models of related proxy properties are pretrained using sufficient data; these models capture common features relevant to the target task. Repurposing of such machine-acquired features on the target task yields outstanding prediction performance even with exceedingly small data sets, as if highly experienced human experts can make rational inferences even for considerably less experienced tasks. In this study, to facilitate widespread use of transfer learning, we develop a pretrained model library called XenonPy.MDL. In this first release, the library comprises more than 140 000 pretrained models for various properties of small molecules, polymers, and inorganic crystalline materials. Along with these pretrained models, we describe some outstanding successes of transfer learning in different scenarios such as building models with only dozens of materials data, increasing the ability of extrapolative prediction through a strategic model transfer, and so on. Remarkably, transfer learning has autonomously identified rather nontrivial transferability across different properties transcending the different disciplines of materials science; for example, our analysis has revealed underlying bridges between small molecules and polymers and between organic and inorganic chemistry.
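The description above outlines the workflow behind the library: pretrain a neural network on a proxy property for which data are abundant, keep the machine-acquired feature extractor, and refit only a small output layer on the scarce target-property data. The following is a minimal PyTorch sketch of that frozen-feature transfer step using synthetic placeholder data; it is illustrative only, is not XenonPy.MDL's actual API, and names such as X_source, y_target, and fit are assumptions introduced here.

```python
# Minimal sketch of frozen-feature transfer learning for property regression:
# pretrain on a data-rich proxy property, reuse the learned features, and
# fit only a small output head on scarce target-property data.
# All data below are synthetic stand-ins; real inputs would be materials
# descriptors (rows) paired with measured or computed property values.
import numpy as np
import torch
import torch.nn as nn

def fit(model, X, y, epochs=300, lr=1e-3):
    """Full-batch regression training; only parameters with requires_grad update."""
    opt = torch.optim.Adam([p for p in model.parameters() if p.requires_grad], lr=lr)
    loss_fn = nn.MSELoss()
    X = torch.as_tensor(X, dtype=torch.float32)
    y = torch.as_tensor(y, dtype=torch.float32).reshape(-1, 1)
    for _ in range(epochs):
        opt.zero_grad()
        loss_fn(model(X), y).backward()
        opt.step()
    return model

rng = np.random.default_rng(0)
n_desc = 64                                  # descriptor dimensionality (placeholder)
X_source = rng.normal(size=(5000, n_desc))   # abundant proxy-property data
y_source = X_source[:, :8].sum(axis=1)
X_target = rng.normal(size=(40, n_desc))     # "only dozens" of target samples
y_target = X_target[:, :8].sum(axis=1) + 0.5 * X_target[:, 8]

# Shared feature extractor plus a property-specific linear head.
features = nn.Sequential(nn.Linear(n_desc, 128), nn.ReLU(),
                         nn.Linear(128, 128), nn.ReLU())
source_head = nn.Linear(128, 1)

# 1) Pretrain the whole network on the proxy property.
fit(nn.Sequential(features, source_head), X_source, y_source)

# 2) Transfer: freeze the machine-acquired features, retrain only a new head
#    on the small target data set.
for p in features.parameters():
    p.requires_grad = False
target_model = nn.Sequential(features, nn.Linear(128, 1))
fit(target_model, X_target, y_target)

# 3) Predict the target property for candidate materials.
with torch.no_grad():
    y_pred = target_model(torch.as_tensor(X_target, dtype=torch.float32))
```

Freezing the pretrained layers is the simplest transfer strategy for very small target data sets; fine-tuning them with a reduced learning rate is a common alternative when somewhat more target data are available.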
format Online
Article
Text
id pubmed-6813555
institution National Center for Biotechnology Information
language English
publishDate 2019
publisher American Chemical Society
record_format MEDLINE/PubMed
spelling pubmed-6813555 2019-10-28 Predicting Materials Properties with Little Data Using Shotgun Transfer Learning Yamada, Hironao Liu, Chang Wu, Stephen Koyama, Yukinori Ju, Shenghong Shiomi, Junichiro Morikawa, Junko Yoshida, Ryo ACS Cent Sci There is a growing demand for the use of machine learning (ML) to derive fast-to-evaluate surrogate models of materials properties. In recent years, a broad array of materials property databases have emerged as part of a digital transformation of materials science. However, recent technological advances in ML are not fully exploited because of the insufficient volume and diversity of materials data. An ML framework called “transfer learning” has considerable potential to overcome the problem of limited amounts of materials data. Transfer learning relies on the concept that various property types, such as physical, chemical, electronic, thermodynamic, and mechanical properties, are physically interrelated. For a given target property to be predicted from a limited supply of training data, models of related proxy properties are pretrained using sufficient data; these models capture common features relevant to the target task. Repurposing of such machine-acquired features on the target task yields outstanding prediction performance even with exceedingly small data sets, as if highly experienced human experts can make rational inferences even for considerably less experienced tasks. In this study, to facilitate widespread use of transfer learning, we develop a pretrained model library called XenonPy.MDL. In this first release, the library comprises more than 140 000 pretrained models for various properties of small molecules, polymers, and inorganic crystalline materials. Along with these pretrained models, we describe some outstanding successes of transfer learning in different scenarios such as building models with only dozens of materials data, increasing the ability of extrapolative prediction through a strategic model transfer, and so on. Remarkably, transfer learning has autonomously identified rather nontrivial transferability across different properties transcending the different disciplines of materials science; for example, our analysis has revealed underlying bridges between small molecules and polymers and between organic and inorganic chemistry. American Chemical Society 2019-09-30 2019-10-23 /pmc/articles/PMC6813555/ /pubmed/31660440 http://dx.doi.org/10.1021/acscentsci.9b00804 Text en Copyright © 2019 American Chemical Society This is an open access article published under an ACS AuthorChoice License (http://pubs.acs.org/page/policy/authorchoice_termsofuse.html), which permits copying and redistribution of the article or any adaptations for non-commercial purposes.
spellingShingle Yamada, Hironao
Liu, Chang
Wu, Stephen
Koyama, Yukinori
Ju, Shenghong
Shiomi, Junichiro
Morikawa, Junko
Yoshida, Ryo
Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title_full Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title_fullStr Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title_full_unstemmed Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title_short Predicting Materials Properties with Little Data Using Shotgun Transfer Learning
title_sort predicting materials properties with little data using shotgun transfer learning
url https://www.ncbi.nlm.nih.gov/pmc/articles/PMC6813555/
https://www.ncbi.nlm.nih.gov/pubmed/31660440
http://dx.doi.org/10.1021/acscentsci.9b00804
work_keys_str_mv AT yamadahironao predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT liuchang predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT wustephen predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT koyamayukinori predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT jushenghong predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT shiomijunichiro predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT morikawajunko predictingmaterialspropertieswithlittledatausingshotguntransferlearning
AT yoshidaryo predictingmaterialspropertieswithlittledatausingshotguntransferlearning