JefiAtten: an attention-based neural network model for solving Maxwell’s equations with charge and current sources
Ming-Yan Sun, Peng Xu, Jun-Jie Zhang, Tai-Jiao Du, Jian-Guo Wang
Machine Learning: Science and Technology
DOI: 10.1088/2632-2153/ad6ee9 (https://doi.org/10.1088/2632-2153/ad6ee9)
Published: 2024-08-23
Citations: 0
Abstract
We present JefiAtten, a novel neural network model employing the attention mechanism to solve Maxwell’s equations efficiently. JefiAtten uses self-attention and cross-attention modules to understand the interplay between charge density, current density, and electromagnetic fields. Our results indicate that JefiAtten generalizes well to a range of scenarios, maintaining accuracy across various spatial distributions and handling amplitude variations. After training, the model computes solutions faster than traditional integral methods. The adaptability of the model suggests potential for broader applications in computational physics, with further refinements to enhance its predictive capabilities and computational efficiency. Our work is a testament to the efficacy of integrating attention mechanisms with numerical simulations, marking a step forward in the quest for data-driven solutions to physical phenomena.
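For reference, the equations named in the title are Maxwell’s equations with charge density $\rho$ and current density $\mathbf{J}$ as sources, written here in SI units (the paper itself may use a different convention):

\[
\nabla \cdot \mathbf{E} = \frac{\rho}{\varepsilon_0}, \qquad
\nabla \cdot \mathbf{B} = 0, \qquad
\nabla \times \mathbf{E} = -\frac{\partial \mathbf{B}}{\partial t}, \qquad
\nabla \times \mathbf{B} = \mu_0 \mathbf{J} + \mu_0 \varepsilon_0 \frac{\partial \mathbf{E}}{\partial t}.
\]

The abstract describes self-attention over the sources and cross-attention between the sources and the electromagnetic fields. The sketch below illustrates one way such a pairing could look in PyTorch; the class name, token layout, and dimensions are illustrative assumptions and do not reproduce the authors' architecture.

```python
# Hypothetical sketch, not the authors' released code: electromagnetic-field
# tokens attend to charge/current-source tokens via self- and cross-attention.
import torch
import torch.nn as nn

class SourceFieldCrossAttention(nn.Module):
    def __init__(self, embed_dim: int = 128, num_heads: int = 4):
        super().__init__()
        # Self-attention over source tokens (rho and J sampled on a grid).
        self.source_self_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        # Cross-attention: field tokens query the source representation.
        self.cross_attn = nn.MultiheadAttention(embed_dim, num_heads, batch_first=True)
        self.norm1 = nn.LayerNorm(embed_dim)
        self.norm2 = nn.LayerNorm(embed_dim)
        self.head = nn.Linear(embed_dim, 6)  # (Ex, Ey, Ez, Bx, By, Bz) per field token

    def forward(self, field_tokens: torch.Tensor, source_tokens: torch.Tensor) -> torch.Tensor:
        # field_tokens:  (batch, n_field_points, embed_dim)
        # source_tokens: (batch, n_source_points, embed_dim)
        s, _ = self.source_self_attn(source_tokens, source_tokens, source_tokens)
        s = self.norm1(source_tokens + s)
        f, _ = self.cross_attn(field_tokens, s, s)
        f = self.norm2(field_tokens + f)
        return self.head(f)  # predicted E and B components at the field points
```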
Journal description:
Machine Learning: Science and Technology is a multidisciplinary open access journal that bridges the application of machine learning across the sciences with advances in machine learning methods and theory as motivated by physical insights. Specifically, articles must either advance the state of machine learning-driven applications in the sciences or make conceptual, methodological, or theoretical advances in machine learning that are applied to, inspired by, or motivated by scientific problems.