EIAformer: Empowering transformer with enhanced information acquisition for time series forecasting

IF 6.5 | CAS Tier 2, Computer Science | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Weina Wang, Yongjie Wang, Xiaolong Qi, Hui Chen
{"title":"EIAformer:增强变压器的信息获取能力,用于时间序列预测","authors":"Weina Wang ,&nbsp;Yongjie Wang ,&nbsp;Xiaolong Qi ,&nbsp;Hui Chen","doi":"10.1016/j.neucom.2025.131700","DOIUrl":null,"url":null,"abstract":"<div><div>Transformer-based models have gained significant popularity and demonstrated remarkable performance in long-term time series forecasting. However, existing Transformer-based models are not designed to fully exploit the variation patterns and multiscale information of time series data. Moreover, there is a lack of channel strategy that effectively captures the essential connections between channels for improving the efficiency and accuracy of channel utilization. To overcome these problems, we propose a novel and adaptable architecture, EIAformer, to utilize comprehensive information to enhance the prediction performance. Firstly, hybrid decomposition is proposed to perform different operations on data with different variation patterns using a divide-and-conquer strategy. Then, dynamic patching based on dilated causal convolution is designed to capture multiscale information. Finally, channel fusion based on Granger Causality and DTW distance is constructed to capture the correlation between different channels, and the merged channels are fed into the encoder to perform prediction. Extensive experiments on nine datasets demonstrate that EIAformer achieves superior performance compared to existing Transformer-based models. Meanwhile, the proposed enhancement module as a plug-and-play solution can boost the performance and efficiency of the Transformer family models.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"658 ","pages":"Article 131700"},"PeriodicalIF":6.5000,"publicationDate":"2025-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"EIAformer: Empowering transformer with enhanced information acquisition for time series forecasting\",\"authors\":\"Weina Wang ,&nbsp;Yongjie Wang ,&nbsp;Xiaolong Qi ,&nbsp;Hui Chen\",\"doi\":\"10.1016/j.neucom.2025.131700\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Transformer-based models have gained significant popularity and demonstrated remarkable performance in long-term time series forecasting. However, existing Transformer-based models are not designed to fully exploit the variation patterns and multiscale information of time series data. Moreover, there is a lack of channel strategy that effectively captures the essential connections between channels for improving the efficiency and accuracy of channel utilization. To overcome these problems, we propose a novel and adaptable architecture, EIAformer, to utilize comprehensive information to enhance the prediction performance. Firstly, hybrid decomposition is proposed to perform different operations on data with different variation patterns using a divide-and-conquer strategy. Then, dynamic patching based on dilated causal convolution is designed to capture multiscale information. Finally, channel fusion based on Granger Causality and DTW distance is constructed to capture the correlation between different channels, and the merged channels are fed into the encoder to perform prediction. Extensive experiments on nine datasets demonstrate that EIAformer achieves superior performance compared to existing Transformer-based models. 
Meanwhile, the proposed enhancement module as a plug-and-play solution can boost the performance and efficiency of the Transformer family models.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"658 \",\"pages\":\"Article 131700\"},\"PeriodicalIF\":6.5000,\"publicationDate\":\"2025-09-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231225023720\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225023720","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Transformer-based models have gained significant popularity and demonstrated remarkable performance in long-term time series forecasting. However, existing Transformer-based models are not designed to fully exploit the variation patterns and multiscale information of time series data. Moreover, there is a lack of channel strategy that effectively captures the essential connections between channels for improving the efficiency and accuracy of channel utilization. To overcome these problems, we propose a novel and adaptable architecture, EIAformer, to utilize comprehensive information to enhance the prediction performance. Firstly, hybrid decomposition is proposed to perform different operations on data with different variation patterns using a divide-and-conquer strategy. Then, dynamic patching based on dilated causal convolution is designed to capture multiscale information. Finally, channel fusion based on Granger Causality and DTW distance is constructed to capture the correlation between different channels, and the merged channels are fed into the encoder to perform prediction. Extensive experiments on nine datasets demonstrate that EIAformer achieves superior performance compared to existing Transformer-based models. Meanwhile, the proposed enhancement module as a plug-and-play solution can boost the performance and efficiency of the Transformer family models.
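The abstract names dilated causal convolution as the basis for its dynamic patching step. As a rough illustration of that building block only (not the authors' implementation; the class name, kernel size, and dilation schedule below are assumptions), a minimal PyTorch sketch of how stacked dilated causal convolutions expose progressively coarser scales of a multivariate series:

```python
# Minimal sketch, assuming a (batch, channels, time) layout. Each level doubles
# the dilation, so its receptive field covers a wider window of the past,
# which is one common way to obtain multiscale features causally.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DilatedCausalPatcher(nn.Module):  # hypothetical name, for illustration
    def __init__(self, channels: int, kernel_size: int = 3, levels: int = 3):
        super().__init__()
        self.convs = nn.ModuleList(
            nn.Conv1d(channels, channels, kernel_size, dilation=2 ** i)
            for i in range(levels)
        )
        # Left-pad by (k-1)*dilation so each output sees only past values.
        self.pads = [(kernel_size - 1) * (2 ** i) for i in range(levels)]

    def forward(self, x: torch.Tensor) -> list:
        scales = []
        for conv, pad in zip(self.convs, self.pads):
            x = conv(F.pad(x, (pad, 0)))  # pad the past only: causal, same length
            scales.append(x)              # one feature map per scale
        return scales

x = torch.randn(8, 7, 96)                 # e.g. 7 variables, 96 time steps
feats = DilatedCausalPatcher(channels=7)(x)
print([tuple(f.shape) for f in feats])    # three scales, each (8, 7, 96)
```

How these per-scale features are turned into patches and fused with the Granger-causality/DTW channel grouping is specific to the paper and not reproduced here.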
Source journal
Neurocomputing
Category: Engineering & Technology, Computer Science: Artificial Intelligence
CiteScore: 13.10
Self-citation rate: 10.00%
Annual publications: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.