{"title":"基于残差学习的叠加集成深度随机向量函数链接网络中尺度时间序列预测","authors":"Ruobin Gao;Minghui Hu;Ruilin Li;Xuewen Luo;Ponnuthurai Nagaratnam Suganthan;M. Tanveer","doi":"10.1109/TNNLS.2025.3529219","DOIUrl":null,"url":null,"abstract":"The deep random vector functional link (dRVFL) and ensemble dRVFL (edRVFL) succeed in various tasks and achieve state-of-the-art performance compared with other randomized neural networks (NNs). However, existing edRVFL structures need more diversity and error correction ability in an independent network. Our work fills the gap by combining stacked deep blocks and residual learning with the edRVFL. Subsequently, we propose a novel dRVFL combined with residual learning, ResdRVFL, whose deep layers calibrate the wrong estimations from shallow layers. Additionally, we propose incorporating a scaling parameter to control the scaling of residuals from shallow layers, thus mitigating the risk of overfitting. Finally, we present an ensemble deep stacking network, SResdRVFL, based on ResdRVFL. SResdRVFL aggregates multiple blocks into a cohesive network, leveraging the benefits of deep learning and ensemble learning. We evaluate the proposed model on 28 datasets and compare it with the state-of-the-art methods. The comparative study demonstrates that the SResdRVFL is the best-performing approach in terms of average ranking and errors based on 28 datasets.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"36 6","pages":"10833-10843"},"PeriodicalIF":8.9000,"publicationDate":"2025-02-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10880477","citationCount":"0","resultStr":"{\"title\":\"Stacked Ensemble Deep Random Vector Functional Link Network With Residual Learning for Medium-Scale Time-Series Forecasting\",\"authors\":\"Ruobin Gao;Minghui Hu;Ruilin Li;Xuewen Luo;Ponnuthurai Nagaratnam Suganthan;M. Tanveer\",\"doi\":\"10.1109/TNNLS.2025.3529219\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The deep random vector functional link (dRVFL) and ensemble dRVFL (edRVFL) succeed in various tasks and achieve state-of-the-art performance compared with other randomized neural networks (NNs). However, existing edRVFL structures need more diversity and error correction ability in an independent network. Our work fills the gap by combining stacked deep blocks and residual learning with the edRVFL. Subsequently, we propose a novel dRVFL combined with residual learning, ResdRVFL, whose deep layers calibrate the wrong estimations from shallow layers. Additionally, we propose incorporating a scaling parameter to control the scaling of residuals from shallow layers, thus mitigating the risk of overfitting. Finally, we present an ensemble deep stacking network, SResdRVFL, based on ResdRVFL. SResdRVFL aggregates multiple blocks into a cohesive network, leveraging the benefits of deep learning and ensemble learning. We evaluate the proposed model on 28 datasets and compare it with the state-of-the-art methods. 
The comparative study demonstrates that the SResdRVFL is the best-performing approach in terms of average ranking and errors based on 28 datasets.\",\"PeriodicalId\":13303,\"journal\":{\"name\":\"IEEE transactions on neural networks and learning systems\",\"volume\":\"36 6\",\"pages\":\"10833-10843\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-02-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10880477\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on neural networks and learning systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10880477/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10880477/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Stacked Ensemble Deep Random Vector Functional Link Network With Residual Learning for Medium-Scale Time-Series Forecasting
The deep random vector functional link (dRVFL) network and the ensemble dRVFL (edRVFL) succeed in various tasks and achieve state-of-the-art performance compared with other randomized neural networks (NNs). However, existing edRVFL structures lack diversity and error-correction ability within each independent network. Our work fills this gap by combining stacked deep blocks and residual learning with the edRVFL. We first propose a novel dRVFL combined with residual learning, ResdRVFL, whose deep layers calibrate the wrong estimations made by the shallow layers. Additionally, we incorporate a scaling parameter to control the scaling of the residuals passed from the shallow layers, thus mitigating the risk of overfitting. Finally, we present an ensemble deep stacking network, SResdRVFL, based on ResdRVFL. SResdRVFL aggregates multiple blocks into a cohesive network, leveraging the benefits of both deep learning and ensemble learning. We evaluate the proposed model on 28 datasets and compare it with state-of-the-art methods. The comparative study demonstrates that SResdRVFL is the best-performing approach in terms of average ranking and error across the 28 datasets.
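The abstract only sketches the residual mechanism at a high level, so the following is a minimal, hypothetical sketch of a residual-learning deep RVFL forecaster, assuming ridge-regression readouts, tanh hidden units, lag-embedded inputs, and a single scalar residual-scaling parameter. All names (ridge_fit, res_drvfl_fit, n_hidden, res_scale, etc.) and hyperparameter values are illustrative assumptions, not the authors' published ResdRVFL/SResdRVFL configuration, and the ensemble aggregation of multiple blocks is omitted.

```python
# Hypothetical sketch of a residual-learning deep RVFL forecaster.
# Each deeper layer fits the (scaled) residual left by the shallower layers.
import numpy as np

rng = np.random.default_rng(0)

def ridge_fit(features, target, lam):
    """Closed-form ridge-regression readout, as commonly used in RVFL networks."""
    a = features.T @ features + lam * np.eye(features.shape[1])
    return np.linalg.solve(a, features.T @ target)

def res_drvfl_fit(X, y, n_layers=3, n_hidden=64, ridge_lam=1e-2, res_scale=0.5):
    """Stack RVFL layers; each layer models what the previous layers left unexplained."""
    layers, residual, hidden_in = [], y.copy(), X
    for _ in range(n_layers):
        # Random, untrained hidden weights (the RVFL ingredient).
        W = rng.standard_normal((hidden_in.shape[1], n_hidden))
        b = rng.standard_normal(n_hidden)
        H = np.tanh(hidden_in @ W + b)
        # Direct links: the readout sees both raw inputs and hidden features.
        F = np.hstack([X, H])
        beta = ridge_fit(F, residual, ridge_lam)
        pred = F @ beta
        layers.append((W, b, beta))
        # Deeper layers correct what remains unexplained; the contribution
        # is damped by res_scale to reduce the risk of overfitting.
        residual = residual - res_scale * pred
        hidden_in = np.hstack([X, H])  # feed enriched features forward
    return layers

def res_drvfl_predict(layers, X, res_scale=0.5):
    hidden_in, out = X, 0.0
    for W, b, beta in layers:
        H = np.tanh(hidden_in @ W + b)
        F = np.hstack([X, H])
        out = out + res_scale * (F @ beta)
        hidden_in = np.hstack([X, H])
    return out

# Toy usage: a lag-embedded noisy sine wave as a stand-in for a real series.
t = np.arange(300, dtype=float)
series = np.sin(0.1 * t) + 0.05 * rng.standard_normal(t.size)
lags = 8
X = np.stack([series[i:i + lags] for i in range(series.size - lags)])
y = series[lags:]
model = res_drvfl_fit(X, y)
print("train RMSE:", np.sqrt(np.mean((res_drvfl_predict(model, X) - y) ** 2)))
```

An ensemble version in the spirit of SResdRVFL could simply train several such blocks with different random seeds and average their forecasts; that aggregation step is left out here for brevity.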
Journal introduction:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.