On a computational paradigm for a class of fractional order direct and inverse problems in terms of physics-informed neural networks with the attention mechanism

Impact Factor 3.1 | CAS Tier 3 (Computer Science) | JCR Q2, Computer Science, Interdisciplinary Applications
M. Srati, A. Oulmelk, L. Afraites, A. Hadri, M.A. Zaky, A.S. Hendy
{"title":"On a computational paradigm for a class of fractional order direct and inverse problems in terms of physics-informed neural networks with the attention mechanism","authors":"M. Srati ,&nbsp;A. Oulmelk ,&nbsp;L. Afraites ,&nbsp;A. Hadri ,&nbsp;M.A. Zaky ,&nbsp;A.S. Hendy","doi":"10.1016/j.jocs.2024.102514","DOIUrl":null,"url":null,"abstract":"<div><div>Physics-Informed Neural Networks (PINNs) have recently gained significant attention for their ability to solve both forward and inverse problems associated with linear and nonlinear fractional partial differential equations (PDEs). However, PINNs, relying on feedforward neural networks (FNNs), overlook the crucial temporal dependencies inherent in practical physics systems. As a result, they fail to globally propagate the initial condition constraints and accurately capture the true solutions under various scenarios. In contrast, the attention mechanism offers flexible means to implicitly exploit patterns within inputs and, moreover, establish relationships between arbitrary query locations and inputs. Thus, we present an attention-based framework for PINNs, which we term PINNs-Transformer (Zhao et al., 2023). The framework was constructed using self-attention and a set of point-wise multilayer perceptrons (MLPs). The novelty is in applying the framework to the various fractional differential equations with stiff dynamics as well as their inverse formulations. We have also validated the PINNs-Transformer on two examples: one involving a fractional diffusion differential equation over time, and the other focused on identifying a space-dependent parameter associated with the direct problem described in the first example. 
We reinforce this finding by conducting a numerical comparison with variant of PINN methods based on criteria such as relative error, complexity, memory needs and execution time.</div></div>","PeriodicalId":48907,"journal":{"name":"Journal of Computational Science","volume":"85 ","pages":"Article 102514"},"PeriodicalIF":3.1000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Science","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1877750324003077","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Physics-Informed Neural Networks (PINNs) have recently gained significant attention for their ability to solve both forward and inverse problems associated with linear and nonlinear fractional partial differential equations (PDEs). However, PINNs, relying on feedforward neural networks (FNNs), overlook the crucial temporal dependencies inherent in practical physics systems. As a result, they fail to globally propagate the initial-condition constraints and to accurately capture the true solutions under various scenarios. In contrast, the attention mechanism offers a flexible means to implicitly exploit patterns within inputs and, moreover, to establish relationships between arbitrary query locations and inputs. Thus, we present an attention-based framework for PINNs, which we term PINNs-Transformer (Zhao et al., 2023). The framework is constructed from self-attention and a set of point-wise multilayer perceptrons (MLPs). The novelty lies in applying the framework to various fractional differential equations with stiff dynamics as well as to their inverse formulations. We validate the PINNs-Transformer on two examples: one involving a time-fractional diffusion equation, and the other focused on identifying a space-dependent parameter associated with the direct problem described in the first example. We reinforce these findings with a numerical comparison against variants of the PINN method based on criteria such as relative error, complexity, memory requirements, and execution time.
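The time-fractional diffusion problem mentioned in the abstract involves a fractional derivative in time, commonly of Caputo type. The paper's own formulation and discretization are not reproduced on this page, so as an illustrative sketch only (not the authors' method), the following shows the standard L1 finite-difference approximation of the Caputo derivative of order α ∈ (0, 1) on a uniform time grid; the function name `caputo_l1` and the grid setup are assumptions for the example:

```python
import math
import numpy as np

def caputo_l1(u, dt, alpha):
    """L1 approximation of the Caputo fractional derivative of order alpha in (0, 1).

    u  : samples u(t_0), ..., u(t_N) on the uniform grid t_j = j * dt
    Returns the approximate derivative at t_1, ..., t_N.
    """
    n = len(u) - 1
    k = np.arange(n, dtype=float)
    b = (k + 1.0) ** (1.0 - alpha) - k ** (1.0 - alpha)  # L1 weights b_k
    du = np.diff(u)                                       # u_{j+1} - u_j
    coef = dt ** (-alpha) / math.gamma(2.0 - alpha)
    out = np.empty(n)
    for m in range(1, n + 1):
        # D^alpha u(t_m) ~= coef * sum_{k=0}^{m-1} b_k * (u_{m-k} - u_{m-k-1})
        out[m - 1] = coef * np.dot(b[:m], du[:m][::-1])
    return out

# Sanity check: for u(t) = t the exact Caputo derivative is t^(1-alpha) / Gamma(2-alpha),
# and the L1 weighted sum telescopes, so the scheme reproduces it to rounding error.
alpha, dt = 0.5, 0.01
t = dt * np.arange(101)
approx = caputo_l1(t, dt, alpha)
exact = t[1:] ** (1.0 - alpha) / math.gamma(2.0 - alpha)
```

In a PINN-style loss, `u` would be the network output sampled at the time grid, and the residual of the fractional PDE built from such an operator would be penalized alongside the data misfit; the attention-based architecture described in the abstract replaces the plain feedforward trunk that evaluates `u`.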
Source journal: Journal of Computational Science
Categories: Computer Science, Interdisciplinary Applications; Computer Science, Theory & Methods
CiteScore: 5.50
Self-citation rate: 3.00%
Articles per year: 227
Review time: 41 days
Aims and scope: Computational Science is a rapidly growing multi- and interdisciplinary field that uses advanced computing and data analysis to understand and solve complex problems. It has reached a level of predictive capability that now firmly complements the traditional pillars of experimentation and theory. Recent advances in experimental techniques, such as detectors, on-line sensor networks and high-resolution imaging, have opened new windows into physical and biological processes at many levels of detail. The resulting data explosion allows for detailed data-driven modeling and simulation. This new discipline combines computational thinking, modern computational methods, devices and collateral technologies to address problems far beyond the scope of traditional numerical methods. Computational science typically unifies three distinct elements:
• Modeling, algorithms and simulations (e.g. numerical and non-numerical, discrete and continuous);
• Software developed to solve science (e.g. biological, physical, and social), engineering, medicine, and humanities problems;
• Computer and information science that develops and optimizes the advanced system hardware, software, networking, and data management components (e.g. problem-solving environments).