A comprehensive survey of fractional gradient descent methods and their convergence analysis

IF 5.6 | CAS Region 1 (Mathematics) | Q1 MATHEMATICS, INTERDISCIPLINARY APPLICATIONS
Sroor M. Elnady, Mohamed El-Beltagy, Ahmed G. Radwan, Mohammed E. Fouda
{"title":"分数阶梯度下降法及其收敛性分析","authors":"Sroor M. Elnady ,&nbsp;Mohamed El-Beltagy ,&nbsp;Ahmed G. Radwan ,&nbsp;Mohammed E. Fouda","doi":"10.1016/j.chaos.2025.116154","DOIUrl":null,"url":null,"abstract":"<div><div>Fractional Gradient Descent (FGD) methods extend classical optimization algorithms by integrating fractional calculus, leading to notable improvements in convergence speed, stability, and accuracy. However, recent studies indicate that engineering challenges—such as tensor-based differentiation in deep neural networks—remain partially unresolved, prompting further investigation into the scalability and computational feasibility of FGD. This paper provides a comprehensive review of recent advancements in FGD techniques, focusing on their approximation methods and convergence properties. These methods are systematically categorized based on their strategies to overcome convergence challenges inherent in fractional-order calculations, such as non-locality and long-memory effects. Key techniques examined include modified fractional-order gradients designed to avoid singularities and ensure convergence to the true extremum. Adaptive step-size strategies and variable fractional-order schemes are analyzed, balancing rapid convergence with precise parameter estimation. Additionally, the application of truncation methods is explored to mitigate oscillatory behavior associated with fractional derivatives. By synthesizing convergence analyses from multiple studies, insights are offered into the theoretical foundations of these methods, including proofs of linear convergence. Ultimately, this paper highlights the effectiveness of various FGD approaches in accelerating convergence and enhancing stability. While also acknowledging significant gaps in practical implementations for large-scale engineering tasks, including deep learning. The presented review serves as a resource for researchers and practitioners in the selection of appropriate FGD techniques for different optimization problems.</div></div>","PeriodicalId":9764,"journal":{"name":"Chaos Solitons & Fractals","volume":"194 ","pages":"Article 116154"},"PeriodicalIF":5.6000,"publicationDate":"2025-02-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A comprehensive survey of fractional gradient descent methods and their convergence analysis\",\"authors\":\"Sroor M. Elnady ,&nbsp;Mohamed El-Beltagy ,&nbsp;Ahmed G. Radwan ,&nbsp;Mohammed E. Fouda\",\"doi\":\"10.1016/j.chaos.2025.116154\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Fractional Gradient Descent (FGD) methods extend classical optimization algorithms by integrating fractional calculus, leading to notable improvements in convergence speed, stability, and accuracy. However, recent studies indicate that engineering challenges—such as tensor-based differentiation in deep neural networks—remain partially unresolved, prompting further investigation into the scalability and computational feasibility of FGD. This paper provides a comprehensive review of recent advancements in FGD techniques, focusing on their approximation methods and convergence properties. These methods are systematically categorized based on their strategies to overcome convergence challenges inherent in fractional-order calculations, such as non-locality and long-memory effects. 
Key techniques examined include modified fractional-order gradients designed to avoid singularities and ensure convergence to the true extremum. Adaptive step-size strategies and variable fractional-order schemes are analyzed, balancing rapid convergence with precise parameter estimation. Additionally, the application of truncation methods is explored to mitigate oscillatory behavior associated with fractional derivatives. By synthesizing convergence analyses from multiple studies, insights are offered into the theoretical foundations of these methods, including proofs of linear convergence. Ultimately, this paper highlights the effectiveness of various FGD approaches in accelerating convergence and enhancing stability. While also acknowledging significant gaps in practical implementations for large-scale engineering tasks, including deep learning. The presented review serves as a resource for researchers and practitioners in the selection of appropriate FGD techniques for different optimization problems.</div></div>\",\"PeriodicalId\":9764,\"journal\":{\"name\":\"Chaos Solitons & Fractals\",\"volume\":\"194 \",\"pages\":\"Article 116154\"},\"PeriodicalIF\":5.6000,\"publicationDate\":\"2025-02-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chaos Solitons & Fractals\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0960077925001675\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chaos Solitons & Fractals","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0960077925001675","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Fractional Gradient Descent (FGD) methods extend classical optimization algorithms by integrating fractional calculus, leading to notable improvements in convergence speed, stability, and accuracy. However, recent studies indicate that engineering challenges, such as tensor-based differentiation in deep neural networks, remain partially unresolved, prompting further investigation into the scalability and computational feasibility of FGD. This paper provides a comprehensive review of recent advancements in FGD techniques, focusing on their approximation methods and convergence properties. These methods are systematically categorized by the strategies they use to overcome convergence challenges inherent in fractional-order calculations, such as non-locality and long-memory effects. Key techniques examined include modified fractional-order gradients designed to avoid singularities and ensure convergence to the true extremum. Adaptive step-size strategies and variable fractional-order schemes are analyzed, balancing rapid convergence with precise parameter estimation. Additionally, the application of truncation methods is explored to mitigate the oscillatory behavior associated with fractional derivatives. By synthesizing convergence analyses from multiple studies, insights are offered into the theoretical foundations of these methods, including proofs of linear convergence. Ultimately, this paper highlights the effectiveness of various FGD approaches in accelerating convergence and enhancing stability, while also acknowledging significant gaps in practical implementations for large-scale engineering tasks, including deep learning. The review serves as a resource for researchers and practitioners selecting appropriate FGD techniques for different optimization problems.
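To make the mechanism concrete, the following is a minimal, illustrative sketch of one widely studied FGD variant, not an algorithm taken from the survey itself. The Caputo power rule gives D^alpha_c (x - c)^n = Gamma(n+1)/Gamma(n+1-alpha) * (x - c)^(n-alpha), so a fractional gradient built with a fixed lower terminal c generally has its zero away from the true extremum; moving the terminal to the previous iterate x_{k-1}, as the modified fractional-order gradients described above do, restores convergence to the extremum. All names, the step size, and the smoothing constant eps below are illustrative assumptions.

import math

def fgd_minimize(grad, x0, alpha=0.9, lr=0.1, steps=200, eps=1e-12):
    """Minimize a 1-D function with a truncated Caputo-type fractional gradient.

    grad  : callable returning f'(x)
    alpha : fractional order in (0, 1]; alpha = 1 recovers classical gradient descent
    """
    gamma = math.gamma(2.0 - alpha)      # Gamma(2 - alpha) from the Caputo power rule
    x_prev, x = x0, x0 - lr * grad(x0)   # bootstrap with one classical GD step
    for _ in range(steps):
        # First-term Caputo approximation with the lower terminal at the previous
        # iterate: D^alpha f(x) ~= f'(x) * |x - x_prev|^(1 - alpha) / Gamma(2 - alpha).
        # The eps offset truncates the singular factor when x == x_prev.
        frac_grad = grad(x) * (abs(x - x_prev) + eps) ** (1.0 - alpha) / gamma
        x_prev, x = x, x - lr * frac_grad
    return x

# Example: minimize f(x) = (x - 3)^2, whose gradient is 2(x - 3).
print(fgd_minimize(lambda x: 2.0 * (x - 3.0), x0=0.0))  # approaches 3.0

Setting alpha = 1 makes the extra factor equal to 1, recovering plain gradient descent, while for alpha < 1 the |x_k - x_{k-1}|^(1 - alpha) term shrinks near the extremum, which is the kind of damped, truncated behavior the abstract attributes to these modified schemes.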
Source journal
Chaos Solitons & Fractals (Physics: Mathematics, Interdisciplinary Applications)
CiteScore: 13.20
Self-citation rate: 10.30%
Articles per year: 1087
Review time: 9 months
About the journal: Chaos, Solitons & Fractals strives to establish itself as a premier journal in the interdisciplinary realm of Nonlinear Science, Non-equilibrium, and Complex Phenomena. It welcomes submissions covering a broad spectrum of topics within this field, including dynamics, non-equilibrium processes in physics, chemistry, and geophysics, complex matter and networks, mathematical models, computational biology, applications to quantum and mesoscopic phenomena, fluctuations and random processes, self-organization, and social phenomena.