Yuanhong Chen , Yifan Lin , Xiang Sun , Chunxin Yuan , Zhen Gao
Journal of Computational Physics, Volume 533, Article 113996. DOI: 10.1016/j.jcp.2025.113996. Published 2025-04-09. Available at: https://www.sciencedirect.com/science/article/pii/S0021999125002797
Tensor decomposition-based neural operator with dynamic mode decomposition for parameterized time-dependent problems
Deep operator networks (DeepONets), as a powerful tool for approximating nonlinear mappings between function spaces, have recently gained significant attention for modeling parameterized partial differential equations. However, limited by the poor extrapolation ability of purely data-driven neural operators, these models tend to fail to predict solutions accurately outside the training time interval. To address this issue, a novel operator learning framework, TDMD-DeepONet, is proposed in this work, based on tensor train decomposition (TTD) and dynamic mode decomposition (DMD). We first demonstrate the mathematical agreement between the TTD representation and that of DeepONet. TTD is then performed on a higher-order tensor assembled from spatio-temporal snapshots collected under a set of parameter values, yielding parameter-, space- and time-dependent cores. DMD is used to model the evolution of the time-dependent core, which is combined with the space-dependent cores to represent the trunk net. As in DeepONet, the branch net is a neural network that takes the parameters as inputs; its outputs are merged with those of the trunk net for prediction. Furthermore, a feature-enhanced TDMD-DeepONet (ETDMD-DeepONet) is proposed to improve accuracy: compared with TDMD-DeepONet, an additional linear layer is incorporated into the branch network, with its input obtained by projecting the initial conditions onto the trunk network. The good performance of the proposed methods is demonstrated on several classical examples, where the results show that the new methods forecast solutions more accurately than the standard DeepONet.
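The extrapolation mechanism at the heart of the abstract is DMD: a linear operator is fitted to successive snapshots so that the fitted dynamics can be propagated past the training window. The sketch below is not the paper's code; it is a generic exact-DMD example on synthetic rank-2 data (the spatial grid, time step, and truncation rank r = 2 are all illustrative assumptions), showing how the fitted eigenvalues extrapolate beyond the last training time.

```python
import numpy as np

# Synthetic spatio-temporal snapshots on t in [0, 2]:
# u(x, t) = exp(-t) sin(x) + 0.5 exp(-2t) cos(2x)  (exactly rank 2)
x = np.linspace(0, 2 * np.pi, 64)
t = np.linspace(0, 2, 41)                      # dt = 0.05
X = (np.exp(-t)[None, :] * np.sin(x)[:, None]
     + 0.5 * np.exp(-2 * t)[None, :] * np.cos(2 * x)[:, None])

# Exact DMD: find a linear operator A with X2 ~= A X1
X1, X2 = X[:, :-1], X[:, 1:]
U, s, Vh = np.linalg.svd(X1, full_matrices=False)
r = 2                                          # truncation rank (assumed)
Ur, sr, Vr = U[:, :r], s[:r], Vh[:r].T
Atilde = Ur.T @ X2 @ Vr / sr                   # reduced operator, shape (r, r)
eigvals, W = np.linalg.eig(Atilde)             # discrete-time eigenvalues
Phi = X2 @ Vr / sr @ W                         # DMD modes, shape (n, r)

# Amplitudes from the initial snapshot, then propagate k steps forward
b = np.linalg.lstsq(Phi, X[:, 0], rcond=None)[0]

def predict(k):
    """Reconstruct/extrapolate the solution k time steps from t = 0."""
    return ((Phi * (eigvals ** k)) @ b).real

u_pred = predict(60)   # t = 3.0, i.e. 20 steps beyond the training window
```

Because the synthetic data follow exactly rank-2 linear dynamics, the DMD eigenvalues recover the decay rates and the extrapolated snapshot matches the analytic solution; on real nonlinear data the accuracy of such extrapolation depends on how well a low-rank linear model captures the dynamics, which is the gap the paper's hybrid trunk net targets.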
About the journal:
Journal of Computational Physics thoroughly treats the computational aspects of physical problems, presenting techniques for the numerical solution of mathematical equations arising in all areas of physics. The journal seeks to emphasize methods that cross disciplinary boundaries.
The Journal of Computational Physics also publishes short notes of 4 pages or less (including figures, tables, and references but excluding title pages). Letters to the Editor commenting on articles already published in this Journal will also be considered. Neither notes nor letters should have an abstract.