EIAformer: Empowering transformer with enhanced information acquisition for time series forecasting
Weina Wang, Yongjie Wang, Xiaolong Qi, Hui Chen
Neurocomputing, Volume 658, Article 131700 (published 2025-09-29)
DOI: 10.1016/j.neucom.2025.131700
https://www.sciencedirect.com/science/article/pii/S0925231225023720
Citations: 0
Abstract
Transformer-based models have gained significant popularity and demonstrated remarkable performance in long-term time series forecasting. However, existing Transformer-based models are not designed to fully exploit the variation patterns and multiscale information of time series data. Moreover, there is a lack of a channel strategy that effectively captures the essential connections between channels to improve the efficiency and accuracy of channel utilization. To overcome these problems, we propose a novel and adaptable architecture, EIAformer, which exploits comprehensive information to enhance prediction performance. First, a hybrid decomposition is proposed to perform different operations on data with different variation patterns using a divide-and-conquer strategy. Then, dynamic patching based on dilated causal convolution is designed to capture multiscale information. Finally, channel fusion based on Granger causality and DTW distance is constructed to capture the correlations between different channels, and the merged channels are fed into the encoder for prediction. Extensive experiments on nine datasets demonstrate that EIAformer achieves superior performance compared to existing Transformer-based models. Moreover, the proposed enhancement module, as a plug-and-play solution, can boost the performance and efficiency of Transformer-family models.
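The dynamic patching step builds on dilated causal convolution, which widens the receptive field along the time axis without letting future values leak into past positions. Below is a minimal PyTorch sketch of that building block only; the class names, kernel size, and dilation schedule (1, 2, 4) are illustrative assumptions and not the exact configuration used in EIAformer.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class DilatedCausalConv1d(nn.Module):
    """Causal 1-D convolution: the input is padded on the left only, so the
    output at time step t never sees values from steps after t."""

    def __init__(self, in_channels, out_channels, kernel_size, dilation):
        super().__init__()
        self.left_pad = (kernel_size - 1) * dilation
        self.conv = nn.Conv1d(in_channels, out_channels, kernel_size,
                              dilation=dilation)

    def forward(self, x):                      # x: (batch, channels, length)
        x = F.pad(x, (self.left_pad, 0))       # pad the time axis on the left
        return self.conv(x)                    # output length == input length


class MultiScaleCausalExtractor(nn.Module):
    """Runs parallel causal convolutions with growing dilation and concatenates
    them, so each output position mixes information from several time scales."""

    def __init__(self, in_channels, hidden, dilations=(1, 2, 4)):
        super().__init__()
        self.branches = nn.ModuleList(
            DilatedCausalConv1d(in_channels, hidden, kernel_size=3, dilation=d)
            for d in dilations
        )

    def forward(self, x):                      # x: (batch, channels, length)
        return torch.cat([branch(x) for branch in self.branches], dim=1)


if __name__ == "__main__":
    x = torch.randn(8, 7, 96)                  # 8 series, 7 channels, 96 steps
    feats = MultiScaleCausalExtractor(in_channels=7, hidden=16)(x)
    print(feats.shape)                         # torch.Size([8, 48, 96])
```

Stacking such branches before forming patches is one plausible way to expose multiscale context to a Transformer encoder; the paper itself defines how EIAformer's dynamic patching and the Granger causality/DTW-based channel fusion are actually implemented.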
Journal introduction:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering neurocomputing theory, practice, and applications.