Optimizing Physics-Informed Neural Networks with hybrid activation functions: A comparative study on improving residual loss and accuracy using partial differential equations

IF 5.3, Zone 1 (Mathematics), Q1 MATHEMATICS, INTERDISCIPLINARY APPLICATIONS
Husna Zafar , Ahmad , Xiangyang Liu , Muhammad Noveel Sadiq
Journal: Chaos, Solitons & Fractals, Volume 191, Article 115727
DOI: 10.1016/j.chaos.2024.115727
Publication date: 2025-02-01
Full text: https://www.sciencedirect.com/science/article/pii/S0960077924012797
Citations: 0

Abstract

Physics-informed neural networks have bridged the gap between traditional numerical and deep-learning-based approaches in scientific computing. However, they still face limitations in improving convergence and accuracy and in minimizing residual loss, where the activation function plays a crucial role. Traditional activation functions often suffer from vanishing gradients during backpropagation, highlighting the need for better alternatives for efficient training of physics-informed neural networks. In this paper, new hybrid activation functions are proposed that combine the salient characteristics of traditional activation functions. These activation functions were tested with different network hyperparameters on the Swift–Hohenberg equation, a leading tool for modeling pattern development and evolution in fields such as thermal convection, fluid, and temperature dynamics, as well as on the Burgers equation. Manual tuning of hyperparameters is employed to critically assess the behavior of the new activation functions in different experimental settings. Results show that hybrid activation functions have better learning capabilities than traditional activation functions. The GaussSwish hybrid activation function, in particular, proved highly effective across different network settings, showing better learning ability when training models for complex problems. This research also reveals that not only the activation function but also the residual points sampled through different Monte Carlo sequences influence the performance of physics-informed neural networks.
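The abstract names GaussSwish as the best-performing hybrid but does not give its formula. A minimal sketch, assuming the hybrid is the product of a Gaussian bump and the Swish activation (both the function name `gauss_swish` and this particular composition are illustrative assumptions, not the authors' published definition):

```python
import numpy as np

def swish(x):
    # Swish (SiLU): x * sigmoid(x); smooth, non-monotonic near zero
    return x / (1.0 + np.exp(-x))

def gaussian(x):
    # Gaussian bump activation: peaks at 1 for x = 0, decays to 0
    return np.exp(-x**2)

def gauss_swish(x):
    # Hypothetical GaussSwish hybrid: elementwise product of the two
    # parent activations. This is one plausible way to combine their
    # salient characteristics, shown for illustration only.
    return gaussian(x) * swish(x)

# Evaluate on a small grid, as one would inside a PINN hidden layer
x = np.linspace(-4.0, 4.0, 9)
y = gauss_swish(x)
```

In a PINN, such a hybrid would replace tanh (or another traditional activation) in each hidden layer; per the abstract, the quasi-random sequence used to sample residual (collocation) points also affects accuracy, so the activation choice is only one of the tuned factors.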
Source journal
Chaos, Solitons & Fractals (Physics; Mathematics, Interdisciplinary Applications)
CiteScore: 13.20
Self-citation rate: 10.30%
Annual publications: 1087
Review time: 9 months
Journal description: Chaos, Solitons & Fractals strives to establish itself as a premier journal in the interdisciplinary realm of Nonlinear Science, Non-equilibrium, and Complex Phenomena. It welcomes submissions covering a broad spectrum of topics within this field, including dynamics, non-equilibrium processes in physics, chemistry, and geophysics, complex matter and networks, mathematical models, computational biology, applications to quantum and mesoscopic phenomena, fluctuations and random processes, self-organization, and social phenomena.