{"title":"基于Caputo分数阶差分自适应调整的卷积神经网络参数训练方法","authors":"Haiming Zhao , Honggang Yang , Jiejie Chen , Ping Jiang , Zhigang Zeng","doi":"10.1016/j.chaos.2025.116588","DOIUrl":null,"url":null,"abstract":"<div><div>As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences—Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam)—to update parameters more flexibly by adjusting momentum information. Commencing from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). These adaptive momentum algorithms, leveraging Grünwald–Letnikov and Caputo fractional-order differences, offer enhanced flexibility, thereby accelerating convergence. Our nonlinear parameter tuning method for CNNs demonstrates superior performance compared to traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 dataset and CIFAR-100 dataset reveal that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization. They not only expedite convergence but also improve the accuracy of image recognition and segmentation.</div></div>","PeriodicalId":9764,"journal":{"name":"Chaos Solitons & Fractals","volume":"199 ","pages":"Article 116588"},"PeriodicalIF":5.6000,"publicationDate":"2025-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Parameter training methods for convolutional neural networks with adaptive adjustment method based on Caputo fractional-order differences\",\"authors\":\"Haiming Zhao , Honggang Yang , Jiejie Chen , Ping Jiang , Zhigang Zeng\",\"doi\":\"10.1016/j.chaos.2025.116588\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences—Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam)—to update parameters more flexibly by adjusting momentum information. Commencing from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). These adaptive momentum algorithms, leveraging Grünwald–Letnikov and Caputo fractional-order differences, offer enhanced flexibility, thereby accelerating convergence. Our nonlinear parameter tuning method for CNNs demonstrates superior performance compared to traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 dataset and CIFAR-100 dataset reveal that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization. 
They not only expedite convergence but also improve the accuracy of image recognition and segmentation.</div></div>\",\"PeriodicalId\":9764,\"journal\":{\"name\":\"Chaos Solitons & Fractals\",\"volume\":\"199 \",\"pages\":\"Article 116588\"},\"PeriodicalIF\":5.6000,\"publicationDate\":\"2025-06-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Chaos Solitons & Fractals\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0960077925006010\",\"RegionNum\":1,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Chaos Solitons & Fractals","FirstCategoryId":"100","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0960077925006010","RegionNum":1,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MATHEMATICS, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Abstract: As deep learning technologies continue to permeate various sectors, optimization algorithms have become increasingly crucial in neural network training. This paper introduces two adaptive momentum algorithms based on Grünwald–Letnikov and Caputo fractional-order differences, Fractional Order Adagrad (FAdagrad) and Fractional Order Adam (FAdam), which update parameters more flexibly by adjusting momentum information. Starting from the definitions of fractional derivatives, we propose integrating fractional-order differences with gradient algorithms in convolutional neural networks (CNNs). By leveraging Grünwald–Letnikov and Caputo fractional-order differences, these adaptive momentum algorithms gain additional flexibility and converge faster. Our nonlinear parameter-tuning method for CNNs outperforms traditional integer-order momentum algorithms and the standard Adam algorithm. Experimental results on the BraTS2021 and CIFAR-100 datasets show that the proposed fractional-order optimization algorithms significantly outperform their integer-order counterparts in model optimization: they not only accelerate convergence but also improve the accuracy of image recognition and segmentation.
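For readers unfamiliar with the two fractional calculi named in the abstract, the standard textbook definitions are recalled below; the abstract itself does not spell out the discrete update rules of FAdagrad and FAdam, so only the classical forms are given. For a gradient sequence g_k and an order α in (0, 1), the Grünwald–Letnikov fractional difference and the Caputo fractional derivative read:

```latex
% Grünwald–Letnikov fractional difference of order \alpha applied to a sequence g_k
\Delta^{\alpha} g_k = \sum_{j=0}^{k} (-1)^{j} \binom{\alpha}{j} g_{k-j},
\qquad
\binom{\alpha}{j} = \frac{\Gamma(\alpha + 1)}{\Gamma(j + 1)\,\Gamma(\alpha - j + 1)}

% Caputo fractional derivative of order 0 < \alpha < 1
{}^{C}\!D^{\alpha} f(t) = \frac{1}{\Gamma(1 - \alpha)} \int_{0}^{t} \frac{f'(\tau)}{(t - \tau)^{\alpha}}\, \mathrm{d}\tau
```

Setting α = 1 recovers the ordinary first-order difference g_k - g_{k-1} and the ordinary derivative, which is why the fractional order can be read as a continuous knob on how strongly past gradient history is weighted.

As a rough illustration only, here is a minimal PyTorch-style sketch of how a truncated Grünwald–Letnikov difference of recent gradients could drive the first moment of an Adam-type update. The class name FAdamSketch, the truncation length history, and the exact placement of the fractional term are assumptions made for this sketch, not the update rules published in the paper.

```python
from collections import deque

import torch


def gl_coefficients(alpha: float, length: int) -> list:
    """Truncated Grünwald–Letnikov weights c_j = (-1)^j * C(alpha, j),
    built with the recurrence c_0 = 1, c_j = c_{j-1} * (1 - (alpha + 1) / j)."""
    coeffs = [1.0]
    for j in range(1, length):
        coeffs.append(coeffs[-1] * (1.0 - (alpha + 1.0) / j))
    return coeffs


class FAdamSketch:
    """Hypothetical Adam-style optimizer whose first moment is fed by a truncated
    Grünwald–Letnikov fractional difference of the most recent gradients."""

    def __init__(self, params, lr=1e-3, alpha=0.9, betas=(0.9, 0.999),
                 eps=1e-8, history=8):
        self.params = [p for p in params if p.requires_grad]
        self.lr, self.betas, self.eps = lr, betas, eps
        self.coeffs = gl_coefficients(alpha, history)
        self.grads = {id(p): deque(maxlen=history) for p in self.params}  # recent gradients
        self.m = {id(p): torch.zeros_like(p) for p in self.params}        # first moment
        self.v = {id(p): torch.zeros_like(p) for p in self.params}        # second moment
        self.t = 0                                                        # step counter

    @torch.no_grad()
    def step(self):
        self.t += 1
        b1, b2 = self.betas
        for p in self.params:
            if p.grad is None:
                continue
            hist = self.grads[id(p)]
            hist.appendleft(p.grad.clone())
            # Fractional "gradient": GL-weighted sum over the stored gradient history.
            frac_grad = sum(c * g for c, g in zip(self.coeffs, hist))
            self.m[id(p)].mul_(b1).add_(frac_grad, alpha=1 - b1)
            self.v[id(p)].mul_(b2).addcmul_(p.grad, p.grad, value=1 - b2)
            m_hat = self.m[id(p)] / (1 - b1 ** self.t)
            v_hat = self.v[id(p)] / (1 - b2 ** self.t)
            p.add_(-self.lr * m_hat / (v_hat.sqrt() + self.eps))
```

In a training loop this sketch would stand in where torch.optim.Adam is normally used (call loss.backward(), then step(), then zero the gradients manually, since zero_grad is omitted for brevity). Setting alpha = 0 makes every weight beyond c_0 vanish, so the fractional term reduces to the current gradient and the sketch collapses to ordinary Adam, consistent with the paper's framing of integer-order methods as special cases.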
About the journal:
Chaos, Solitons & Fractals strives to establish itself as a premier journal in the interdisciplinary realm of Nonlinear Science, Non-equilibrium, and Complex Phenomena. It welcomes submissions covering a broad spectrum of topics within this field, including dynamics, non-equilibrium processes in physics, chemistry, and geophysics, complex matter and networks, mathematical models, computational biology, applications to quantum and mesoscopic phenomena, fluctuations and random processes, self-organization, and social phenomena.