Published in Nature in November 2023
Credibility Rating
Gold standard. Rigorous peer review, high editorial standards, and strong institutional reputation.
Rating inherited from publication venue: Nature
Demonstrates how large-scale graph neural networks can accelerate materials discovery, relevant to AI safety research on AI capability scaling, generalization, and real-world applications of deep learning systems.
Paper Details
Metadata
Summary
This Nature paper demonstrates that graph neural networks trained at scale can reach unprecedented levels of generalization, accelerating materials discovery by an order of magnitude. Building on 48,000 stable crystals identified in prior studies, the researchers' deep learning models discovered 2.2 million new structures below the current convex hull, an order-of-magnitude expansion of the stable materials known to humanity; 736 of these structures have already been independently experimentally realized. The hundreds of millions of underlying first-principles calculations also yield highly accurate learned interatomic potentials for condensed-phase molecular dynamics simulations and zero-shot prediction of ionic conductivity, enabling rapid screening of candidates for clean energy and information-processing applications.
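The stability criterion at the heart of the result, a structure lying "below the current convex hull", can be illustrated with a small phase-diagram check. The sketch below is only a minimal illustration assuming the pymatgen library, with made-up compositions and energies rather than anything from the paper; it shows how the energy-above-hull quantity used to label a candidate as stable is typically computed.

```python
# Minimal energy-above-hull check: a candidate crystal is predicted stable
# when its energy lies on the convex hull of competing phases (e_above_hull == 0).
# Compositions and total energies below are placeholders, not values from the paper.
from pymatgen.core import Composition
from pymatgen.analysis.phase_diagram import PDEntry, PhaseDiagram

entries = [
    PDEntry(Composition("Li"), 0.0),     # elemental reference (placeholder energy, eV)
    PDEntry(Composition("O2"), 0.0),     # elemental reference (placeholder energy, eV)
    PDEntry(Composition("Li2O"), -6.0),  # known competing phase (placeholder energy, eV)
    PDEntry(Composition("LiO2"), -2.5),  # hypothetical candidate (placeholder energy, eV)
]

diagram = PhaseDiagram(entries)
candidate = entries[-1]
e_hull = diagram.get_e_above_hull(candidate)  # eV/atom; 0.0 means on the hull, i.e. stable
print(f"{candidate.composition.reduced_formula}: {e_hull:.3f} eV/atom above hull")
```

At the scale described in the paper, this kind of check is run against hundreds of millions of first-principles reference energies rather than a four-entry toy system.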
Cited by 1 page
| Page | Type | Quality |
|---|---|---|
| Scientific Research Capabilities | Capability | 68.0 |
Cached Content Preview
Scaling deep learning for materials discovery | Nature
Subjects: Computer science, Scaling laws
Abstract
Novel functional materials enable fundamental breakthroughs across technological applications from clean energy to information processing [1–11]. From microchips to batteries and photovoltaics, discovery of inorganic crystals has been bottlenecked by expensive trial-and-error approaches. Concurrently, deep-learning models for language, vision and biology have showcased emergent predictive capabilities with increasing data and computation [12–14]. Here we show that graph networks trained at scale can reach unprecedented levels of generalization, improving the efficiency of materials discovery by an order of magnitude. Building on 48,000 stable crystals identified in continuing studies [15–17], improved efficiency enables the discovery of 2.2 million structures below the current convex hull, many of which escaped previous human chemical intuition. Our work represents an order-of-magnitude expansion in stable materials known to humanity. Stable discoveries that are on the final convex hull will be made available to screen for technological applications, as we demonstrate for layered materials and solid-electrolyte candidates. Of the stable structures, 736 have already been independently experimentally realized. The scale and diversity of hundreds of millions of first-principles calculations also unlock modelling capabilities for downstream applications, leading in particular to highly accurate and robust learned interatomic potentials that can be used in condensed-phase molecular-dynamics simulations and high-fidelity zero-shot prediction of ionic conductivity.
... (truncated, 83 KB total)
fab26d57329d2e8d | Stable ID: YTU2ZTVhZD
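As background on the learned interatomic potentials and condensed-phase molecular dynamics mentioned in the abstract, here is a minimal MD loop in ASE. The EMT calculator, the copper supercell, and all run parameters are stand-ins chosen so the snippet runs as written; the paper's own learned potentials, materials, and simulation settings are not reproduced here.

```python
# Minimal condensed-phase MD sketch with ASE. EMT is a cheap classical
# calculator used here purely as a placeholder for a learned interatomic
# potential; the system and parameters are illustrative only.
from ase import units
from ase.build import bulk
from ase.calculators.emt import EMT
from ase.md.langevin import Langevin
from ase.md.velocitydistribution import MaxwellBoltzmannDistribution

atoms = bulk("Cu", "fcc", a=3.6).repeat((3, 3, 3))  # small copper supercell
atoms.calc = EMT()  # a learned potential would be attached here instead

MaxwellBoltzmannDistribution(atoms, temperature_K=600)  # initialize velocities
dyn = Langevin(atoms, timestep=2.0 * units.fs, temperature_K=600, friction=0.02)
dyn.run(100)  # short trajectory; transport properties need far longer runs
```

In the setting the abstract describes, a learned interatomic potential trained on first-principles data would replace the classical calculator, and quantities such as ionic conductivity would be estimated from much longer trajectories.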