Crystal graph attention networks for the prediction of stable materials

Jonathan Schmidt, Love Pettersson, Claudio Verdozzi, Silvana Botti, Miguel A. L. Marques

Research output: Contribution to journal › Article › peer-review

Abstract

Graph neural networks for crystal structures typically use the atomic positions and the atomic species as input. Unfortunately, this information is not available when predicting new materials, for which the precise geometry is unknown. We circumvent this problem by replacing the precise bond distances with embeddings of graph distances. This allows our networks to be applied directly in high-throughput studies based on both composition and crystal structure prototype, without using relaxed structures as input. To train these networks, we curate a dataset of over 2 million density functional calculations of crystals with consistent calculation parameters. We apply the resulting model to a high-throughput search of 15 million tetragonal perovskites of composition ABCD2. As a result, we identify several thousand potentially stable compounds and demonstrate that transfer learning from the newly curated dataset reduces the required training data by 50%.
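
The central modification described above, replacing continuous bond lengths with learned embeddings of discrete graph distances as edge features, can be sketched as follows. This is a minimal illustrative approximation, not the authors' implementation; the hop-count cutoff, embedding width, and function names are assumed for the example.

# Minimal sketch: edges carry an embedding of the discrete graph distance
# (hop count in the crystal graph) instead of the continuous bond length,
# so no relaxed geometry is required as input.

import numpy as np

rng = np.random.default_rng(0)

MAX_GRAPH_DISTANCE = 4   # assumed cutoff on hop counts
EMBED_DIM = 16           # assumed embedding width

# Lookup table with one vector per possible graph distance; in a real model
# these vectors would be trained jointly with the attention layers.
distance_embedding = rng.normal(size=(MAX_GRAPH_DISTANCE + 1, EMBED_DIM))

def edge_features(graph_distances):
    """Map integer graph distances to edge feature vectors."""
    d = np.clip(graph_distances, 0, MAX_GRAPH_DISTANCE)
    return distance_embedding[d]

# Example: three edges that are 1, 1, and 2 hops apart in the crystal graph.
print(edge_features(np.array([1, 1, 2])).shape)   # (3, 16)

Because the edge features depend only on the graph topology of a structure prototype, the same featurization applies to unrelaxed candidate compositions, which is what enables the high-throughput screening described in the abstract.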
Original language: English
Article number: 7948
Number of pages: 11
Journal: Science Advances
Volume: 7
Issue number: 49
DOIs
Publication status: Published - 2021 Dec 3

Subject classification (UKÄ)

  • Condensed Matter Physics
