Honghao Liu, Yining Diao, Ke Sun, Zhaolin Wan, Zhiyang Li
Title: AMSFormer: A transformer with adaptive multi-scale partitioning and multi-level spectral filtering for time-series forecasting
DOI: 10.1016/j.neucom.2025.130067
Journal: Neurocomputing, Volume 637, Article 130067 (JCR Q1, Computer Science, Artificial Intelligence)
Published: 2025-03-24 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0925231225007398
Citations: 0
Abstract
Time series forecasting is essential in numerous real-world scenarios, yet it remains a challenging task due to complex temporal dependencies and the coexistence of multi-scale, periodic, and non-periodic patterns. In this paper, we present an innovative approach called the Adaptive Multi-Scale Transformer (AMSFormer) to tackle these challenges. AMSFormer integrates both frequency-domain and time-domain information to model local and global features collaboratively. By leveraging Fourier transforms, AMSFormer adaptively partitions time series data into patches that align with the data’s intrinsic patterns, enabling dynamic and efficient processing. A convolutional attention mechanism is employed to extract fine-grained local features at multiple scales while maintaining low computational overhead. For global feature extraction, AMSFormer utilizes a hierarchical frequency-domain filter to isolate key periodic components and suppress noise, enhancing the stability and accuracy of global pattern modeling. Extensive experiments on several real-world benchmark datasets demonstrate that AMSFormer consistently outperforms state-of-the-art models, highlighting its robust generalization ability and wide applicability across various forecasting tasks.
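The abstract describes two spectral ideas: using Fourier transforms to choose patch boundaries that match the data's intrinsic periodicity, and filtering in the frequency domain to keep dominant periodic components while suppressing noise. The paper's exact formulation is not given here, so the sketch below is only an illustrative NumPy approximation of both steps: estimating a dominant period from the FFT amplitude spectrum to set an adaptive patch length, and a simple top-k frequency filter as a stand-in for the hierarchical filter. The function names and the k parameter are assumptions, not the authors' API.

```python
import numpy as np

def dominant_period(x: np.ndarray) -> int:
    """Estimate the dominant period of a 1-D series from its FFT amplitude spectrum.
    (Hypothetical helper; a simplified stand-in for AMSFormer's adaptive partitioning.)"""
    spec = np.abs(np.fft.rfft(x - x.mean()))
    spec[0] = 0.0                       # ignore the DC component
    k = int(np.argmax(spec))            # strongest frequency bin
    return max(1, len(x) // max(k, 1))  # period = series length / frequency index

def topk_frequency_filter(x: np.ndarray, k: int = 3) -> np.ndarray:
    """Keep only the k largest-amplitude frequency components (simple denoising);
    a flat, single-level stand-in for the paper's hierarchical frequency-domain filter."""
    freq = np.fft.rfft(x)
    amp = np.abs(freq)
    amp[0] = 0.0                        # don't let the mean dominate the ranking
    keep = np.argsort(amp)[-k:]         # indices of the k strongest components
    mask = np.zeros_like(freq)
    mask[keep] = freq[keep]
    mask[0] = freq[0]                   # retain the series mean
    return np.fft.irfft(mask, n=len(x))

# Example: a noisy sine with period 24
t = np.arange(240)
x = np.sin(2 * np.pi * t / 24) + 0.1 * np.random.default_rng(0).normal(size=240)
p = dominant_period(x)                                 # ≈ 24; could serve as an adaptive patch length
patches = x[: (len(x) // p) * p].reshape(-1, p)        # partition into period-aligned patches
x_filtered = topk_frequency_filter(x, k=3)             # smoothed global component
```

Partitioning by the detected period means each patch covers one full cycle, so patch-level attention compares like with like; the real model presumably learns this jointly rather than with a hard argmax.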
Journal description:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering its theory, practice, and applications.