A novel dual-channel model with adaptive multi-scale attention for time series forecasting

Impact Factor 8.0 · CAS Zone 2 (Computer Science) · JCR Q1 (Automation & Control Systems)
Shuqing Wang, Jinghao Lu, Ren Wang, Xiaofeng Zhang, Hua Wang, Yujuan Sun
{"title":"A novel dual-channel model with adaptive multi-scale attention for time series forecasting","authors":"Shuqing Wang ,&nbsp;Jinghao Lu ,&nbsp;Ren Wang ,&nbsp;Xiaofeng Zhang ,&nbsp;Hua Wang ,&nbsp;Yujuan Sun","doi":"10.1016/j.engappai.2025.112803","DOIUrl":null,"url":null,"abstract":"<div><div>Time series forecasting plays a crucial role in various domains, including finance, traffic management, energy, and healthcare. However, as application scenarios continue to expand, the complexity of time series data has significantly increased, posing substantial challenges in capturing trend fluctuations of multivariate features and the dynamic relationships among them. To address these issues, this paper proposes a novel architecture–DASformer (<strong>D</strong>ual-Channel model with <strong>A</strong>daptive multi-<strong>S</strong>cale attention) - which enhances time series analysis by leveraging a dual-channel multivariate extractor and an adaptive multi-scale attention mechanism. Specifically, the dual-channel multivariate extractor comprises two independent yet interactive streams, focusing on capturing information at different levels of the time series, thereby effectively decoupling complex dynamic relationships. Moreover, to alleviate the problem of feature forgetting and loss in the long-term trend stream, the model incorporates an adaptive multi-scale attention module. This module adopts multi-scale processing and a dynamic weighting mechanism to learn dependencies across different scales and effectively capture their dynamic variations. Experimental results show that DASformer consistently achieves state-of-the-art performance on nine widely used benchmark datasets, delivering superior prediction accuracy, particularly in long-term forecasting tasks. The source code is available at: <span><span>https://github.com/LDU-TSA/DASformer</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":50523,"journal":{"name":"Engineering Applications of Artificial Intelligence","volume":"162 ","pages":"Article 112803"},"PeriodicalIF":8.0000,"publicationDate":"2025-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Engineering Applications of Artificial Intelligence","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0952197625028349","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Time series forecasting plays a crucial role in various domains, including finance, traffic management, energy, and healthcare. However, as application scenarios continue to expand, the complexity of time series data has significantly increased, posing substantial challenges in capturing trend fluctuations of multivariate features and the dynamic relationships among them. To address these issues, this paper proposes a novel architecture, DASformer (Dual-Channel model with Adaptive multi-Scale attention), which enhances time series analysis by leveraging a dual-channel multivariate extractor and an adaptive multi-scale attention mechanism. Specifically, the dual-channel multivariate extractor comprises two independent yet interactive streams, focusing on capturing information at different levels of the time series, thereby effectively decoupling complex dynamic relationships. Moreover, to alleviate the problem of feature forgetting and loss in the long-term trend stream, the model incorporates an adaptive multi-scale attention module. This module adopts multi-scale processing and a dynamic weighting mechanism to learn dependencies across different scales and effectively capture their dynamic variations. Experimental results show that DASformer consistently achieves state-of-the-art performance on nine widely used benchmark datasets, delivering superior prediction accuracy, particularly in long-term forecasting tasks. The source code is available at: https://github.com/LDU-TSA/DASformer.
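The authors' implementation is in the linked repository; as a rough illustration only, the NumPy sketch below mimics the general idea the abstract describes for the attention module: process the input window at several temporal scales, attend within each scale, and fuse the branches with a data-dependent (dynamic) weighting. All function names, the choice of average pooling, and the branch-energy weighting statistic are assumptions for illustration, not the paper's actual design.

```python
import numpy as np

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(x):
    """Plain scaled dot-product self-attention over a (T, d) sequence."""
    d = x.shape[-1]
    scores = x @ x.T / np.sqrt(d)        # (T, T) pairwise similarities
    return softmax(scores, axis=-1) @ x  # (T, d) context-mixed features

def multi_scale_attention(x, scales=(1, 2, 4)):
    """Hypothetical multi-scale attention with dynamic weighting.

    For each scale s: average-pool the sequence by factor s, attend over
    the pooled sequence, upsample back to length T. A softmax over a
    per-branch statistic then fuses the scales with dynamic weights.
    """
    T, d = x.shape
    branches, energies = [], []
    for s in scales:
        Tp = T // s
        pooled = x[: Tp * s].reshape(Tp, s, d).mean(axis=1)  # downsample
        attended = self_attention(pooled)                    # per-scale attention
        up = np.repeat(attended, s, axis=0)                  # upsample to ~T steps
        if up.shape[0] < T:                                  # pad tail if T % s != 0
            up = np.vstack([up, np.repeat(up[-1:], T - up.shape[0], axis=0)])
        branches.append(up[:T])
        energies.append(np.linalg.norm(attended) / Tp)       # crude branch statistic
    w = softmax(np.array(energies))                          # dynamic scale weights
    return sum(wi * b for wi, b in zip(w, branches))

# Toy usage: a 96-step window of 7 variables, as in common forecasting benchmarks.
out = multi_scale_attention(np.random.randn(96, 7))
print(out.shape)  # (96, 7)
```

In the sketch, the scale weights are recomputed from the input itself, which is one simple way to realize a "dynamic weighting mechanism"; the paper's module presumably learns these weights jointly with the rest of the network.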
Source journal: Engineering Applications of Artificial Intelligence (Engineering: Electrical & Electronic)
CiteScore: 9.60
Self-citation rate: 10.00%
Articles per year: 505
Review time: 68 days
Journal description: Artificial Intelligence (AI) is pivotal in driving the fourth industrial revolution, witnessing remarkable advancements across various machine learning methodologies. AI techniques have become indispensable tools for practicing engineers, enabling them to tackle previously insurmountable challenges. Engineering Applications of Artificial Intelligence serves as a global platform for the swift dissemination of research elucidating the practical application of AI methods across all engineering disciplines. Submitted papers are expected to present novel aspects of AI utilized in real-world engineering applications, validated using publicly available datasets to ensure the replicability of research outcomes. Join us in exploring the transformative potential of AI in engineering.