{"title":"DiffGrad for Physics-Informed Neural Networks","authors":"Jamshaid Ul Rahman, Nimra","doi":"arxiv-2409.03239","DOIUrl":null,"url":null,"abstract":"Physics-Informed Neural Networks (PINNs) are regarded as state-of-the-art\ntools for addressing highly nonlinear problems based on partial differential\nequations. Despite their broad range of applications, PINNs encounter several\nperformance challenges, including issues related to efficiency, minimization of\ncomputational cost, and enhancement of accuracy. Burgers' equation, a\nfundamental equation in fluid dynamics that is extensively used in PINNs,\nprovides flexible results with the Adam optimizer that does not account for\npast gradients. This paper introduces a novel strategy for solving Burgers'\nequation by incorporating DiffGrad with PINNs, a method that leverages the\ndifference between current and immediately preceding gradients to enhance\nperformance. A comprehensive computational analysis is conducted using\noptimizers such as Adam, Adamax, RMSprop, and DiffGrad to evaluate and compare\ntheir effectiveness. Our approach includes visualizing the solutions over space\nat various time intervals to demonstrate the accuracy of the network. The\nresults show that DiffGrad not only improves the accuracy of the solution but\nalso reduces training time compared to the other optimizers.","PeriodicalId":501369,"journal":{"name":"arXiv - PHYS - Computational Physics","volume":"124 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - PHYS - Computational Physics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.03239","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0
Abstract
Physics-Informed Neural Networks (PINNs) are regarded as state-of-the-art tools for addressing highly nonlinear problems based on partial differential equations. Despite their broad range of applications, PINNs face several performance challenges related to efficiency, computational cost, and accuracy. Burgers' equation, a fundamental equation in fluid dynamics that is widely used as a PINN benchmark, yields inconsistent results with the Adam optimizer, which does not account for past gradients. This paper introduces a novel strategy for solving Burgers' equation by combining PINNs with DiffGrad, an optimizer that leverages the difference between the current and the immediately preceding gradient to improve performance. A comprehensive computational analysis compares the effectiveness of the Adam, Adamax, RMSprop, and DiffGrad optimizers. Our approach includes visualizing the solutions over space at various time intervals to demonstrate the accuracy of the network. The results show that DiffGrad not only improves the accuracy of the solution but also reduces training time compared to the other optimizers.
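For context: the abstract does not state which form of Burgers' equation the paper solves. The one-dimensional viscous form is the standard PINN benchmark, and a PINN is trained by minimizing the PDE residual at collocation points plus a data misfit at initial and boundary points. A minimal LaTeX sketch, assuming that standard setup (the symbols N_f, N_b, and u_j are illustrative, not taken from the paper):

\[
\frac{\partial u}{\partial t} + u \frac{\partial u}{\partial x} = \nu \frac{\partial^2 u}{\partial x^2},
\]
\[
\mathcal{L}(\theta) = \frac{1}{N_f} \sum_{i=1}^{N_f} \bigl| u_t(x_i,t_i) + u\,u_x(x_i,t_i) - \nu\,u_{xx}(x_i,t_i) \bigr|^2
+ \frac{1}{N_b} \sum_{j=1}^{N_b} \bigl| u(x_j,t_j) - u_j \bigr|^2,
\]

where u(x,t;\theta) is the network output, \nu the viscosity, the first sum runs over interior collocation points, the second over initial/boundary data, and all derivatives are obtained by automatic differentiation.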
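The abstract describes DiffGrad only as using the difference between current and immediately preceding gradients. In the published DiffGrad method (Dubey et al., 2019), that difference controls a per-parameter "friction" coefficient that scales an otherwise Adam-like step. A minimal NumPy sketch of that update rule, not the authors' implementation (function and state names are illustrative):

import numpy as np

def diffgrad_init(theta):
    """Optimizer state: step count, Adam moments, and the previous gradient."""
    return {"t": 0,
            "m": np.zeros_like(theta),
            "v": np.zeros_like(theta),
            "g_prev": np.zeros_like(theta)}

def diffgrad_step(theta, grad, state, lr=1e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One DiffGrad update: an Adam step scaled by sigmoid(|g_prev - g|)."""
    state["t"] += 1
    t = state["t"]
    # Adam-style biased moment estimates, followed by bias correction.
    state["m"] = beta1 * state["m"] + (1 - beta1) * grad
    state["v"] = beta2 * state["v"] + (1 - beta2) * grad ** 2
    m_hat = state["m"] / (1 - beta1 ** t)
    v_hat = state["v"] / (1 - beta2 ** t)
    # Friction coefficient: sigmoid of the absolute change between consecutive
    # gradients. A near-constant gradient gives xi ~ 0.5 (damped step); a large
    # change gives xi ~ 1 (essentially the full Adam step).
    xi = 1.0 / (1.0 + np.exp(-np.abs(state["g_prev"] - grad)))
    state["g_prev"] = grad.copy()
    return theta - lr * xi * m_hat / (np.sqrt(v_hat) + eps)

# Usage on a toy quadratic loss f(theta) = ||theta||^2 / 2, whose gradient is theta.
theta = np.array([1.0, -2.0])
state = diffgrad_init(theta)
for _ in range(1000):
    theta = diffgrad_step(theta, theta.copy(), state, lr=0.05)
print(theta)  # approaches [0, 0]

In a PINN, grad would be the gradient of the composite loss above with respect to the network parameters; the friction term damps updates in flat regions of the loss, which is one plausible reason for the faster, more accurate training the paper reports.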