Jason B. Gibson, Ajinkya C. Hire, Philip M. Dee, Oscar Barrera, Benjamin Geisler, Peter J. Hirschfeld, Richard G. Hennig
{"title":"","authors":"Jason B. Gibson, Ajinkya C. Hire, Philip M. Dee, Oscar Barrera, Benjamin Geisler, Peter J. Hirschfeld, Richard G. Hennig","doi":"10.1038/s41524-024-01475-4","DOIUrl":null,"url":null,"abstract":"<p>Integrating deep learning with the search for new electron-phonon superconductors represents a burgeoning field of research, where the primary challenge lies in the computational intensity of calculating the electron-phonon spectral function, <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>), the essential ingredient of Midgal-Eliashberg theory of superconductivity. To overcome this challenge, we adopt a two-step approach. First, we compute <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>) for 818 dynamically stable materials. We then train a deep-learning model to predict <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>), using a training strategy tailored for limited data to temper the model’s overfitting, enhancing predictions. Specifically, we train a Bootstrapped Ensemble of Tempered Equivariant graph neural NETworks (BETE-NET), obtaining an MAE of 0.21, 45 K, and 43 K for the moments derived from <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>): <i>λ</i>, <span>\\({\\omega }_{\\log }\\)</span>, and <i>ω</i><sub>2</sub>, respectively, yielding an MAE of 2.5 K for the critical temperature, <i>T</i><sub><i>c</i></sub>. Further, we incorporate domain knowledge of the site-projected phonon density of states to impose inductive bias into the model’s node attributes and enhance predictions. This methodological innovation decreases the MAE to 0.18, 29 K, and 28 K, respectively, yielding an MAE of 2.1 K for <i>T</i><sub><i>c</i></sub>. We illustrate the practical application of our model in high-throughput screening for high-<i>T</i><sub>c</sub> materials. The model demonstrates an average precision nearly five times higher than random screening, highlighting the potential of ML in accelerating superconductor discovery. BETE-NET accelerates the search for high-<i>T</i><sub>c</sub> superconductors while setting a precedent for applying ML in materials discovery, particularly when data is limited.</p>","PeriodicalId":19342,"journal":{"name":"npj Computational Materials","volume":"61 1","pages":""},"PeriodicalIF":9.4000,"publicationDate":"2025-01-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Accelerating superconductor discovery through tempered deep learning of the electron-phonon spectral function\",\"authors\":\"Jason B. Gibson, Ajinkya C. Hire, Philip M. Dee, Oscar Barrera, Benjamin Geisler, Peter J. Hirschfeld, Richard G. Hennig\",\"doi\":\"10.1038/s41524-024-01475-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Integrating deep learning with the search for new electron-phonon superconductors represents a burgeoning field of research, where the primary challenge lies in the computational intensity of calculating the electron-phonon spectral function, <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>), the essential ingredient of Midgal-Eliashberg theory of superconductivity. To overcome this challenge, we adopt a two-step approach. First, we compute <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>) for 818 dynamically stable materials. We then train a deep-learning model to predict <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>), using a training strategy tailored for limited data to temper the model’s overfitting, enhancing predictions. 
Specifically, we train a Bootstrapped Ensemble of Tempered Equivariant graph neural NETworks (BETE-NET), obtaining an MAE of 0.21, 45 K, and 43 K for the moments derived from <i>α</i><sup>2</sup><i>F</i>(<i>ω</i>): <i>λ</i>, <span>\\\\({\\\\omega }_{\\\\log }\\\\)</span>, and <i>ω</i><sub>2</sub>, respectively, yielding an MAE of 2.5 K for the critical temperature, <i>T</i><sub><i>c</i></sub>. Further, we incorporate domain knowledge of the site-projected phonon density of states to impose inductive bias into the model’s node attributes and enhance predictions. This methodological innovation decreases the MAE to 0.18, 29 K, and 28 K, respectively, yielding an MAE of 2.1 K for <i>T</i><sub><i>c</i></sub>. We illustrate the practical application of our model in high-throughput screening for high-<i>T</i><sub>c</sub> materials. The model demonstrates an average precision nearly five times higher than random screening, highlighting the potential of ML in accelerating superconductor discovery. BETE-NET accelerates the search for high-<i>T</i><sub>c</sub> superconductors while setting a precedent for applying ML in materials discovery, particularly when data is limited.</p>\",\"PeriodicalId\":19342,\"journal\":{\"name\":\"npj Computational Materials\",\"volume\":\"61 1\",\"pages\":\"\"},\"PeriodicalIF\":9.4000,\"publicationDate\":\"2025-01-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"npj Computational Materials\",\"FirstCategoryId\":\"88\",\"ListUrlMain\":\"https://doi.org/10.1038/s41524-024-01475-4\",\"RegionNum\":1,\"RegionCategory\":\"材料科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"CHEMISTRY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"npj Computational Materials","FirstCategoryId":"88","ListUrlMain":"https://doi.org/10.1038/s41524-024-01475-4","RegionNum":1,"RegionCategory":"材料科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"CHEMISTRY, PHYSICAL","Score":null,"Total":0}
Accelerating superconductor discovery through tempered deep learning of the electron-phonon spectral function
Integrating deep learning with the search for new electron-phonon superconductors represents a burgeoning field of research, where the primary challenge lies in the computational intensity of calculating the electron-phonon spectral function, α²F(ω), the essential ingredient of the Migdal-Eliashberg theory of superconductivity. To overcome this challenge, we adopt a two-step approach. First, we compute α²F(ω) for 818 dynamically stable materials. We then train a deep-learning model to predict α²F(ω), using a training strategy tailored for limited data to temper overfitting and enhance predictions. Specifically, we train a Bootstrapped Ensemble of Tempered Equivariant graph neural NETworks (BETE-NET), obtaining MAEs of 0.21, 45 K, and 43 K for the moments derived from α²F(ω): λ, ω_log, and ω₂, respectively, yielding an MAE of 2.5 K for the critical temperature, Tc. Further, we incorporate domain knowledge of the site-projected phonon density of states to impose an inductive bias on the model's node attributes and enhance predictions. This methodological innovation decreases the MAEs to 0.18, 29 K, and 28 K, respectively, yielding an MAE of 2.1 K for Tc. We illustrate the practical application of our model in high-throughput screening for high-Tc materials. The model demonstrates an average precision nearly five times higher than random screening, highlighting the potential of ML in accelerating superconductor discovery. BETE-NET accelerates the search for high-Tc superconductors while setting a precedent for applying ML in materials discovery, particularly when data are limited.
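For context on how the reported moments relate to Tc: λ, ω_log, and ω₂ are standard integrals of α²F(ω), and a common estimate of Tc follows from the McMillan-Allen-Dynes formula. The short Python sketch below illustrates these textbook relations; the toy Gaussian α²F(ω), the frequency grid, and the Coulomb pseudopotential μ* = 0.10 are illustrative assumptions and are not taken from the paper.

import numpy as np

def eliashberg_moments(omega, a2f):
    """Compute lambda, omega_log, and omega_2 from a tabulated alpha^2 F(omega).

    omega : strictly positive frequency grid (here in K)
    a2f   : alpha^2 F(omega) sampled on the same grid
    """
    lam = 2.0 * np.trapz(a2f / omega, omega)                              # lambda = 2 * integral of a2F/omega
    omega_log = np.exp((2.0 / lam) * np.trapz(a2f * np.log(omega) / omega, omega))  # logarithmic average frequency
    omega_2 = np.sqrt((2.0 / lam) * np.trapz(a2f * omega, omega))         # second-moment frequency <omega^2>^(1/2)
    return lam, omega_log, omega_2

def allen_dynes_tc(lam, omega_log, mu_star=0.10):
    """McMillan-Allen-Dynes estimate of Tc; omega_log in K gives Tc in K."""
    return (omega_log / 1.2) * np.exp(-1.04 * (1.0 + lam)
                                      / (lam - mu_star * (1.0 + 0.62 * lam)))

# Toy single-peak alpha^2 F(omega) on a frequency grid in K (illustrative only)
omega = np.linspace(1.0, 1000.0, 2000)
a2f = 0.5 * np.exp(-((omega - 300.0) / 80.0) ** 2)
lam, w_log, w_2 = eliashberg_moments(omega, a2f)
print(f"lambda={lam:.2f}, omega_log={w_log:.0f} K, omega_2={w_2:.0f} K, "
      f"Tc~{allen_dynes_tc(lam, w_log):.1f} K")

Note that λ is dimensionless while ω_log and ω₂ carry units of frequency (reported here in K), matching the units of the MAEs quoted in the abstract.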
Journal description:
npj Computational Materials is a high-quality open access journal from Nature Research that publishes research papers applying computational approaches for the design of new materials and enhancing our understanding of existing ones. The journal also welcomes papers on new computational techniques and the refinement of current approaches that support these aims, as well as experimental papers that complement computational findings.
Some key features of npj Computational Materials include a 2-year impact factor of 12.241 (2021), article downloads of 1,138,590 (2021), and a fast turnaround time of 11 days from submission to the first editorial decision. The journal is indexed in various databases and services, including Chemical Abstracts Service (CAS), Astrophysics Data System (ADS), Current Contents/Physical, Chemical and Earth Sciences, Journal Citation Reports/Science Edition, SCOPUS, EI Compendex, INSPEC, Google Scholar, SCImago, DOAJ, CNKI, and Science Citation Index Expanded (SCIE), among others.