Broad learning system based on fractional order optimization
Dan Zhang, Tong Zhang, Zhang Tao, C.L. Philip Chen
Neural Networks, Volume 188, Article 107468. Published 2025-04-12. DOI: 10.1016/j.neunet.2025.107468
IF 6.0 · JCR Q1 (Computer Science, Artificial Intelligence)
https://www.sciencedirect.com/science/article/pii/S0893608025003478
Citations: 0
Abstract
Due to its efficient incremental learning performance, the broad learning system (BLS) has received widespread attention in the field of machine learning. Research on the algorithm has shown that using the maximum correntropy criterion (MCC) can further improve the performance of broad learning in handling outliers. Recent studies have also shown that differential equations can be used to represent the forward propagation of deep learning. The MCC-based BLS uses differentiation to optimize its parameters, which indicates that differential methods can likewise be used for BLS optimization. However, conventional methods rely on integer-order differential equations, discarding the system information that lies between integer orders. Exploiting the long-term memory property of fractional differential equations, this paper introduces fractional-order optimization into the BLS, yielding FOBLS, to better enhance the data processing capability of the BLS. First, a fractional-order BLS is constructed, incorporating long-term memory characteristics into the weight optimization process. In addition, a dynamic incremental learning scheme based on fractional order further enhances the network's optimization ability. The experimental results demonstrate the excellent performance of the proposed method.
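The "long-term memory" the abstract refers to comes from the fact that a fractional derivative of a trajectory depends on its entire history, not just its current value. As an illustration only (the paper's actual FOBLS update is not reproduced here), the sketch below implements a generic fractional-order gradient step using truncated Grünwald–Letnikov weights, so that each update mixes a short memory of past gradients; the function names, the toy quadratic objective, and all hyperparameters are assumptions for the demo, not the authors' method.

```python
def gl_coeffs(alpha, K):
    """Grünwald–Letnikov weights g_k = (-1)^k * C(alpha, k), via the
    stable recurrence g_k = g_{k-1} * (1 - (alpha + 1) / k).
    For alpha = 1 this reduces to [1, -1, 0, ...] (a first difference)."""
    g = [1.0]
    for k in range(1, K + 1):
        g.append(g[-1] * (1.0 - (alpha + 1.0) / k))
    return g

def fractional_gd(grad, w0, alpha=0.5, lr=0.2, memory=5, iters=300):
    """Gradient descent whose step is a GL-weighted sum of recent gradients,
    giving the long-term-memory flavour of a fractional derivative.
    Truncating the history at `memory` terms is the standard
    'short-memory principle' approximation."""
    g = gl_coeffs(alpha, memory)
    w = w0
    history = []  # most recent gradient first
    for _ in range(iters):
        history.insert(0, grad(w))
        history = history[: memory + 1]
        step = sum(gk * hk for gk, hk in zip(g, history))
        w -= lr * step
    return w

# Toy objective f(w) = (w - 3)^2 with gradient 2 * (w - 3);
# the iterate converges toward the minimizer w = 3.
w_opt = fractional_gd(lambda w: 2.0 * (w - 3.0), w0=0.0)
```

Because the weights g_k decay in magnitude but never vanish for non-integer alpha, older gradients keep a (shrinking) influence on every step; an integer order discards that tail, which is the gap the paper's fractional-order formulation targets.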
About the journal:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.