Parameter training methods for convolutional neural networks with adaptive adjustment method based on Caputo fractional-order differences

IF 5.6 | CAS Zone 1 (Mathematics) | Q1 MATHEMATICS, INTERDISCIPLINARY APPLICATIONS
Haiming Zhao, Honggang Yang, Jiejie Chen, Ping Jiang, Zhigang Zeng
{"title":"基于Caputo分数阶差分自适应调整的卷积神经网络参数训练方法","authors":"Haiming Zhao ,&nbsp;Honggang Yang ,&nbsp;Jiejie Chen ,&nbsp;Ping Jiang ,&nbsp;Zhigang Zeng","doi":"10.1016/j.chaos.2025.116588","DOIUrl":null,"url":null,"abstract":"<div><div>As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences—Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam)—to update parameters more flexibly by adjusting momentum information. Commencing from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). These adaptive momentum algorithms, leveraging Grünwald–Letnikov and Caputo fractional-order differences, offer enhanced flexibility, thereby accelerating convergence. Our nonlinear parameter tuning method for CNNs demonstrates superior performance compared to traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 dataset and CIFAR-100 dataset reveal that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization. They not only expedite convergence but also improve the accuracy of image recognition and segmentation.</div></div>","PeriodicalId":9764,"journal":{"name":"Chaos Solitons & Fractals","volume":"199 ","pages":"Article 116588"},"PeriodicalIF":5.6000,"publicationDate":"2025-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Parameter training methods for convolutional neural networks with adaptive adjustment method based on Caputo fractional-order differences\",\"authors\":\"Haiming Zhao ,&nbsp;Honggang Yang ,&nbsp;Jiejie Chen ,&nbsp;Ping Jiang ,&nbsp;Zhigang Zeng\",\"doi\":\"10.1016/j.chaos.2025.116588\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences—Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam)—to update parameters more flexibly by adjusting momentum information. Commencing from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). These adaptive momentum algorithms, leveraging Grünwald–Letnikov and Caputo fractional-order differences, offer enhanced flexibility, thereby accelerating convergence. Our nonlinear parameter tuning method for CNNs demonstrates superior performance compared to traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 dataset and CIFAR-100 dataset reveal that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization. 
They not only expedite convergence but also improve the accuracy of image recognition and segmentation.</div></div>\",\"PeriodicalId\":9764,\"journal\":{\"name\":\"Chaos Solitons & Fractals\",\"volume\":\"199 \",\"pages\":\"Article 116588\"},\"PeriodicalIF\":5.6000,\"publicationDate\":\"2025-06-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chaos Solitons & Fractals\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0960077925006010\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chaos Solitons & Fractals","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0960077925006010","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences—Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam)—to update parameters more flexibly by adjusting momentum information. Commencing from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). These adaptive momentum algorithms, leveraging Grünwald–Letnikov and Caputo fractional-order differences, offer enhanced flexibility, thereby accelerating convergence. Our nonlinear parameter tuning method for CNNs demonstrates superior performance compared to traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 dataset and CIFAR-100 dataset reveal that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization. They not only expedite convergence but also improve the accuracy of image recognition and segmentation.
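To make the idea of a fractional-order momentum term concrete: the truncated Grünwald–Letnikov difference of order α applied to a gradient sequence g_t is approximately Σ_{k=0}^{K-1} (-1)^k C(α, k) g_{t-k}, which reduces to the ordinary first difference when α = 1. The sketch below shows how such a weighted gradient history could stand in for the integer-order first moment in an Adam-style optimizer. It is only an illustration of the general idea under assumed hyperparameters (fractional order α, truncation depth K, learning rate); the class name FAdamSketch and all defaults are hypothetical and do not reproduce the exact FAdam or FAdagrad formulations of the paper.

```python
import numpy as np

def gl_weights(alpha, K):
    """Truncated Grünwald–Letnikov weights w_k = (-1)^k * binom(alpha, k),
    computed with the recurrence w_k = w_{k-1} * (1 - (alpha + 1) / k)."""
    w = np.empty(K)
    w[0] = 1.0
    for k in range(1, K):
        w[k] = w[k - 1] * (1.0 - (alpha + 1.0) / k)
    return w

class FAdamSketch:
    """Hypothetical Adam-style optimizer whose first moment is a truncated
    Grünwald–Letnikov fractional difference of the recent gradient history.
    Illustrative sketch only, not the authors' exact FAdam algorithm."""

    def __init__(self, alpha=0.9, lr=1e-3, beta2=0.999, eps=1e-8, K=5):
        self.w = gl_weights(alpha, K)   # fractional-difference weights
        self.lr, self.beta2, self.eps, self.K = lr, beta2, eps, K
        self.grads = []                 # short history of past gradients
        self.v = None                   # second moment, as in standard Adam

    def step(self, theta, grad):
        self.grads.insert(0, grad)
        self.grads = self.grads[: self.K]
        if self.v is None:
            self.v = np.zeros_like(grad)
        # Fractional "momentum": GL-weighted sum over the gradient history.
        m = sum(w_k * g_k for w_k, g_k in zip(self.w, self.grads))
        # Second moment and parameter update follow the usual Adam pattern.
        self.v = self.beta2 * self.v + (1.0 - self.beta2) * grad ** 2
        return theta - self.lr * m / (np.sqrt(self.v) + self.eps)
```

With α = 1 and K = 1 the weighted history collapses to the current gradient, so the fractional order acts as an extra knob controlling how strongly older gradients contribute to the update.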
Source journal
Chaos Solitons & Fractals (Physics / Mathematics, Interdisciplinary Applications)
CiteScore: 13.20
Self-citation rate: 10.30%
Publication volume: 1087
Review time: 9 months
Aims and scope: Chaos, Solitons & Fractals strives to establish itself as a premier journal in the interdisciplinary realm of Nonlinear Science, Non-equilibrium, and Complex Phenomena. It welcomes submissions covering a broad spectrum of topics within this field, including dynamics, non-equilibrium processes in physics, chemistry, and geophysics, complex matter and networks, mathematical models, computational biology, applications to quantum and mesoscopic phenomena, fluctuations and random processes, self-organization, and social phenomena.