
Formula Graph Self‐Attention Network for Representation‐Domain Independent Materials Discovery

The success of machine learning (ML) in materials property prediction depends heavily on how the materials are represented for learning. Two dominant families of material descriptors exist, one that encodes crystal structure in the representation and the other that only uses stoichiometric information with the hope of discovering new materials. Graph neural networks (GNNs) in particular have excelled in predicting material properties within chemical accuracy. However, current GNNs are limited to only one of the above two avenues owing to the little overlap between respective material representations. Here, a new concept of formula graph which unifies stoichiometry‐only and structure‐based material descriptors is introduced. A self‐attention integrated GNN that assimilates a formula graph is further developed and it is found that the proposed architecture produces material embeddings transferable between the two domains. The proposed model can outperform some previously reported structure‐agnostic models and their structure‐based counterparts while exhibiting better sample efficiency and faster convergence. Finally, the model is applied in a challenging exemplar to predict the complex dielectric function of materials and nominate new substances that potentially exhibit epsilon‐near‐zero phenomena.
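
The record does not include the paper's implementation, so the following is only a rough, hypothetical sketch of the "formula graph" idea described in the abstract: one node per element in a composition, weighted by its stoichiometric fraction, with full pairwise connectivity so that self‐attention can operate over element pairs. The names parse_formula and formula_graph, and the choice of plain Python without a GNN library, are illustrative assumptions rather than the authors' code.

import re
from itertools import permutations

def parse_formula(formula: str) -> dict:
    """Split a flat formula such as 'BaTiO3' into fractional amounts per element."""
    counts: dict = {}
    for symbol, amount in re.findall(r"([A-Z][a-z]?)(\d*\.?\d*)", formula):
        counts[symbol] = counts.get(symbol, 0.0) + float(amount or 1)
    total = sum(counts.values())
    return {el: n / total for el, n in counts.items()}

def formula_graph(formula: str):
    """Fully connected stoichiometry-only graph: one node per element
    (attribute = fractional amount), directed edges between every ordered
    pair of distinct element nodes."""
    nodes = list(parse_formula(formula).items())        # [('Ba', 0.2), ...]
    edges = list(permutations(range(len(nodes)), 2))    # (i, j) node-index pairs
    return nodes, edges

nodes, edges = formula_graph("BaTiO3")
print(nodes)   # [('Ba', 0.2), ('Ti', 0.2), ('O', 0.6)]
print(edges)   # [(0, 1), (0, 2), (1, 0), (1, 2), (2, 0), (2, 1)]

In the published model the node attributes would be learned element embeddings rather than bare fractions, and a structure‐based variant would derive edges from atomic neighborhoods instead of full connectivity; this sketch only illustrates the stoichiometry‐only end of that spectrum.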

Bibliographic Details
Main Authors: Ihalage, Achintha; Hao, Yang
Format: Online Article (Text)
Language: English
Published: John Wiley and Sons Inc., 2022 (online 27 April 2022)
Journal: Adv Sci (Weinh), Research Articles
Collection: PubMed (PMC9218748), National Center for Biotechnology Information
Subjects: Research Articles
Rights: © 2022 The Authors. Advanced Science published by Wiley‐VCH GmbH. Open access under the Creative Commons Attribution 4.0 License (https://creativecommons.org/licenses/by/4.0/), which permits use, distribution, and reproduction in any medium, provided the original work is properly cited.
Online Access:
https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9218748/
https://www.ncbi.nlm.nih.gov/pubmed/35475548
http://dx.doi.org/10.1002/advs.202200164