Unifying machine learning and interpolation theory via interpolating neural networks.

Chanwook Park, Sourav Saha, Jiachen Guo, Hantao Zhang, Xiaoyu Xie, Miguel A. Bessa, Dong Qian, Wei Chen, Gregory J. Wanger, Jian Cao, Thomas J. R. Hughes, Wing Kam Liu

Nature Communications 76(1): 8753. Published 2025-10-01. DOI: 10.1038/s41467-025-63790-8.

Abstract: Computational science and engineering are shifting toward data-centric, optimization-based, self-correcting solvers driven by artificial intelligence. This transition faces challenges such as low accuracy with sparse data, poor scalability, and high computational cost in complex system design. This work introduces the Interpolating Neural Network (INN), a network architecture that blends interpolation theory with tensor decomposition. INN significantly reduces computational effort and memory requirements while maintaining high accuracy, outperforming traditional partial differential equation (PDE) solvers, machine learning (ML) models, and physics-informed neural networks (PINNs). It also handles sparse data efficiently and enables dynamic updates of its nonlinear activations. Demonstrated in metal additive manufacturing, INN rapidly constructs an accurate surrogate model of a Laser Powder Bed Fusion (L-PBF) heat-transfer simulation, achieving sub-10-micrometer resolution for a 10 mm path in under 15 minutes on a single GPU, which is 5-8 orders of magnitude faster than competing ML models. This offers a new perspective for addressing challenges in computational science and engineering.
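To make the "interpolation theory as network architecture" idea concrete, here is a minimal illustrative sketch, not the paper's INN implementation: a 1D layer whose trainable parameters are nodal values on a fixed grid, with piecewise-linear hat (shape) functions from finite-element interpolation playing the role of the activation. The grid, target function, and fitting procedure below are all assumptions chosen for illustration; the model is linear in its parameters, so it can be fit with a single linear solve — one intuition for why interpolation-based architectures can be far cheaper to train than generic neural networks.

```python
import numpy as np

def hat_basis(x, nodes):
    """Evaluate all piecewise-linear shape functions N_i(x) on a fixed grid.

    Each N_i is 1 at node i, 0 at all other nodes, and linear in between,
    so sum_i N_i(x) == 1 for any x inside the grid (partition of unity).
    """
    x = np.atleast_1d(np.asarray(x, dtype=float))
    N = np.zeros((x.size, nodes.size))
    for i, xi in enumerate(nodes):
        left = nodes[i - 1] if i > 0 else xi
        right = nodes[i + 1] if i < nodes.size - 1 else xi
        if xi > left:   # rising edge on [left, xi]
            m = (x >= left) & (x <= xi)
            N[m, i] = (x[m] - left) / (xi - left)
        if right > xi:  # falling edge on (xi, right]
            m = (x > xi) & (x <= right)
            N[m, i] = (right - x[m]) / (right - xi)
        N[x == xi, i] = 1.0
    return N

# Hypothetical setup: 5 interpolation nodes on [0, 1], fit to sin(pi*x).
nodes = np.linspace(0.0, 1.0, 5)
x_train = np.linspace(0.0, 1.0, 50)
y_train = np.sin(np.pi * x_train)

# Because f(x) = sum_i u_i * N_i(x) is linear in the nodal values u,
# "training" is one least-squares solve rather than gradient descent.
A = hat_basis(x_train, nodes)
u, *_ = np.linalg.lstsq(A, y_train, rcond=None)

y_pred = float(hat_basis(0.5, nodes) @ u)  # prediction at the midpoint
```

A real INN would extend this picture to higher dimensions via tensor decomposition of the nodal values, which is where the reported memory and speed gains come from; the sketch above only shows the 1D interpolation-layer building block.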
Journal introduction:
Nature Communications, an open-access journal, publishes high-quality research spanning all areas of the natural sciences. Papers featured in the journal showcase significant advances relevant to specialists in each respective field. With a 2-year impact factor of 16.6 (2022) and a median time of 8 days from submission to the first editorial decision, Nature Communications is committed to rapid dissemination of research findings. As a multidisciplinary journal, it welcomes contributions from biological, health, physical, chemical, Earth, social, mathematical, applied, and engineering sciences, aiming to highlight important breakthroughs within each domain.