{"title":"Fast and generalizable micromagnetic simulation with deep neural nets","authors":"Yunqi Cai, Jiangnan Li, Dong Wang","doi":"10.1038/s42256-024-00914-7","DOIUrl":null,"url":null,"abstract":"Important progress has been made in micromagnetics, driven by its wide-ranging applications in magnetic storage design. Numerical simulation, a cornerstone of micromagnetics research, relies on first-principles rules to compute the dynamic evolution of micromagnetic systems using the renowned Landau–Lifshitz–Gilbert equation, named after Landau, Lifshitz and Gilbert. However, these simulations are often hindered by their slow speeds. Although fast Fourier transformation calculations reduce the computational complexity to O(Nlog(N)), it remains impractical for large-scale simulations. Here we introduce NeuralMAG, a deep learning approach to micromagnetic simulation. Our approach follows the Landau–Lifshitz–Gilbert iterative framework but accelerates computation of demagnetizing fields by employing a U-shaped neural network. This neural network architecture comprises an encoder that extracts aggregated spins at various scales and learns the local interaction at each scale, followed by a decoder that accumulates the local interactions at different scales to approximate the global convolution. This divide-and-accumulate scheme achieves a time complexity of O(N), notably enhancing the speed and feasibility of large-scale simulations. Unlike existing neural methods, NeuralMAG concentrates on the core computation—rather than an end-to-end approximation for a specific task—making it inherently generalizable. To validate the new approach, we trained a single model and evaluated it on two micromagnetics tasks with various sample sizes, shapes and material settings. Many physical systems involve long-range interactions, which present a considerable obstacle to large-scale simulations. Cai, Li and Wang introduce NeuralMAG, a deep learning approach to reduce complexity and accelerate micromagnetic simulations.","PeriodicalId":48533,"journal":{"name":"Nature Machine Intelligence","volume":"6 11","pages":"1330-1343"},"PeriodicalIF":18.8000,"publicationDate":"2024-11-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Nature Machine Intelligence","FirstCategoryId":"94","ListUrlMain":"https://www.nature.com/articles/s42256-024-00914-7","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Important progress has been made in micromagnetics, driven by its wide-ranging applications in magnetic storage design. Numerical simulation, a cornerstone of micromagnetics research, relies on first-principles rules to compute the dynamic evolution of micromagnetic systems using the Landau–Lifshitz–Gilbert (LLG) equation. However, these simulations are often hindered by their slow speed. Although fast Fourier transform methods reduce the computational complexity to O(N log N), this remains impractical for large-scale simulations. Here we introduce NeuralMAG, a deep learning approach to micromagnetic simulation. Our approach follows the LLG iterative framework but accelerates the computation of demagnetizing fields with a U-shaped neural network. This architecture comprises an encoder that extracts aggregated spins at various scales and learns the local interaction at each scale, followed by a decoder that accumulates the local interactions at the different scales to approximate the global convolution. This divide-and-accumulate scheme achieves a time complexity of O(N), notably enhancing the speed and feasibility of large-scale simulations. Unlike existing neural methods, NeuralMAG concentrates on the core computation, rather than an end-to-end approximation for a specific task, making it inherently generalizable. To validate the approach, we trained a single model and evaluated it on two micromagnetics tasks with various sample sizes, shapes and material settings.

Many physical systems involve long-range interactions, which present a considerable obstacle to large-scale simulations. Cai, Li and Wang introduce NeuralMAG, a deep learning approach that reduces this complexity and accelerates micromagnetic simulations.
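To make the abstract's divide-and-accumulate idea concrete, the sketch below shows (in PyTorch, which the paper does not specify) how a small U-shaped encoder–decoder could stand in for the FFT-based demagnetizing-field solve inside an explicit LLG iteration. The class name `TinyUNetDemag`, the helper `llg_step`, the channel counts, the downsampling scheme and the time-step/damping constants are all illustrative assumptions, not the authors' NeuralMAG architecture.

```python
# Minimal sketch, assuming a PyTorch implementation: a U-shaped network maps a
# magnetization field m(x, y) to an approximate demagnetizing field H_d(x, y),
# which is then used in an explicit Landau-Lifshitz-Gilbert (LLG) update.
import torch
import torch.nn as nn
import torch.nn.functional as F

class TinyUNetDemag(nn.Module):
    """Encoder aggregates spins at coarser scales and learns local interactions;
    the decoder accumulates them back to full resolution (divide-and-accumulate)."""
    def __init__(self, ch=16):
        super().__init__()
        self.enc1 = nn.Sequential(nn.Conv2d(3, ch, 3, padding=1), nn.ReLU())
        self.enc2 = nn.Sequential(nn.Conv2d(ch, 2 * ch, 3, stride=2, padding=1), nn.ReLU())
        self.enc3 = nn.Sequential(nn.Conv2d(2 * ch, 4 * ch, 3, stride=2, padding=1), nn.ReLU())
        self.dec2 = nn.Sequential(nn.Conv2d(6 * ch, 2 * ch, 3, padding=1), nn.ReLU())
        self.dec1 = nn.Conv2d(3 * ch, 3, 3, padding=1)  # 3 output field components

    def forward(self, m):                      # m: (B, 3, H, W) unit magnetization
        e1 = self.enc1(m)                      # full resolution: local interactions
        e2 = self.enc2(e1)                     # 1/2 resolution: aggregated spins
        e3 = self.enc3(e2)                     # 1/4 resolution: long-range part
        d2 = F.interpolate(e3, size=e2.shape[-2:], mode="nearest")
        d2 = self.dec2(torch.cat([d2, e2], dim=1))    # accumulate mid-scale terms
        d1 = F.interpolate(d2, size=e1.shape[-2:], mode="nearest")
        return self.dec1(torch.cat([d1, e1], dim=1))  # approximate demag field H_d

def llg_step(m, h_eff, dt=1e-3, alpha=0.02):
    """One explicit LLG update: dm/dt = -m x h_eff - alpha * m x (m x h_eff)."""
    precession = torch.cross(m, h_eff, dim=1)
    damping = torch.cross(m, precession, dim=1)
    m_new = m - dt * (precession + alpha * damping)
    return m_new / m_new.norm(dim=1, keepdim=True)    # renormalize |m| = 1

# Usage: the neural demag approximation replaces the FFT-based solve in the loop.
model = TinyUNetDemag()
m = F.normalize(torch.randn(1, 3, 64, 64), dim=1)     # random initial spins
h_demag = model(m)                                    # O(N) forward pass
m = llg_step(m, h_demag)                              # other effective-field terms omitted
```

The point of the multiscale structure is that each convolution only learns short-range interactions, while the downsampled branches capture the long-range part of the dipolar field on coarse grids; because every scale costs a constant amount of work per cell, the total cost grows linearly with the number of cells N rather than as N log N.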
About the journal:
Nature Machine Intelligence is a distinguished publication that presents original research and reviews on various topics in machine learning, robotics, and AI. Our focus extends beyond these fields, exploring their profound impact on other scientific disciplines as well as their societal and industrial implications. We recognize limitless possibilities wherein machine intelligence can augment human capabilities and knowledge in domains like scientific exploration, healthcare, medical diagnostics, and the creation of safe and sustainable cities, transportation, and agriculture. Simultaneously, we acknowledge the emergence of ethical, social, and legal concerns due to the rapid pace of advancements.
To foster interdisciplinary discussions on these far-reaching implications, Nature Machine Intelligence serves as a platform for dialogue facilitated through Comments, News Features, News & Views articles, and Correspondence. Our goal is to encourage a comprehensive examination of these subjects.
Similar to all Nature-branded journals, Nature Machine Intelligence operates under the guidance of a team of skilled editors. We adhere to a fair and rigorous peer-review process, ensuring high standards of copy-editing and production, swift publication, and editorial independence.