An Optimized Neural Network for Efficient Resource Utilization and Enhanced Accuracy in Magnetic Field Prediction

Authors: Xinsheng Yang; Zining Wang; Lingyue Wang; Rentian Zhang; Guizhi Xu; Qingxin Yang
Journal: IEEE Transactions on Artificial Intelligence, vol. 7, no. 5, pp. 3006-3017
DOI: 10.1109/TAI.2024.3462301
URL: https://ieeexplore.ieee.org/document/10684276/
First published online: 19 September 2024
Citations: 0
Abstract
This article presents a deep learning approach that enables numerical calculation of magnetic fields in various electromagnetic devices. Compared with the finite element analysis (FEA) method, the trained model computes significantly faster. A bitmap technique improves the accuracy with which information within the solution domain is represented, and a shifted window-based multihead self-attention (SW-MSA) mechanism analyzes the device information within that domain. Because magnetic flux density is nonnegative, the Softplus activation function is incorporated into the neural network model, yielding the proposed Softplus-Enhanced Swin-Unet (SESU). Magnetic field prediction is then carried out for three types of electromagnetic devices: coils, transformers, and motors. Compared with the commonly used convolutional neural network (CNN) and vision transformer (ViT) models, this approach achieves at least a 10-fold improvement in prediction accuracy while reducing computational resource consumption by 35%. The proposed method is validated through FEA and comparative experiments.
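The motivation for Softplus in the abstract is the nonnegativity of magnetic flux density: a smooth activation whose output is strictly positive lets the network's final layer respect that physical constraint. A minimal NumPy sketch of that idea (an illustration only; the actual SESU architecture and its training pipeline are not reproduced here, and `raw` stands in for arbitrary network outputs):

```python
import numpy as np

def softplus(x):
    """Softplus activation: log(1 + exp(x)).

    Smooth, monotone, and strictly positive for every real input,
    which makes it a natural final-layer activation when predicting
    a nonnegative quantity such as magnetic flux density magnitude.
    """
    # np.logaddexp(0, x) computes log(exp(0) + exp(x)) without
    # overflowing for large |x|.
    return np.logaddexp(0.0, x)

# Hypothetical raw network outputs: unconstrained real values.
raw = np.array([-5.0, 0.0, 3.0])
flux = softplus(raw)

# After Softplus, every predicted value is strictly positive.
assert np.all(flux > 0)
```

Unlike ReLU, which clamps negative inputs to exactly zero and has a zero gradient there, Softplus keeps a nonzero gradient everywhere, so the constraint is enforced without creating dead units during training.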