{"title":"AFEformer:用于时间序列预测的自适应频率增强变压器","authors":"Zhiyong An, Lanlan Dong","doi":"10.1016/j.engappai.2025.112736","DOIUrl":null,"url":null,"abstract":"<div><div>Long-term time series forecasting (LTSF), as a key research domain with pervasive applications in real-world scenarios, has garnered sustained interest from both academic and industrial communities. Although transformer-based models have demonstrated high predictive capability in capturing long-term temporal dependencies, most of them directly process raw data in the time domain while ignoring the representation of features in the frequency domain. Additionally, transformer models with frequency domain often learn weights directly but overlook frequency statistics for time series, leading to the impact of low-quality interference frequencies. Moreover, Transformer’s self-attention captures correlations solely within sequences but neglects correlations among different sequences, increasing susceptibility to overfitting. To address these issues, we innovatively design an adaptive frequency enhancement transformer (AFEformer) with temporal external attention for time series forecasting, which focuses on enhancing important frequency domain features to provide more accurate forecasting. Specifically, a frequency domain enhancement module with an adaptive threshold strategy is proposed , using frequency statistics to selectively extract key spectral components and strengthen frequency domain features. Furthermore, the temporal external attention enhancement module with Infinite Norm and dropout layer is presented to explore potential correlations between different sample sequences and mitigate overfitting. Regarding long-term forecasting, comprehensive experiments demonstrate that AFEformer achieves state-of-the-art forecasting performance on nine time series forecasting benchmarks.</div></div>","PeriodicalId":50523,"journal":{"name":"Engineering Applications of Artificial Intelligence","volume":"163 ","pages":"Article 112736"},"PeriodicalIF":8.0000,"publicationDate":"2025-10-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"AFEformer: An adaptive frequency enhancement transformer for time series prediction\",\"authors\":\"Zhiyong An, Lanlan Dong\",\"doi\":\"10.1016/j.engappai.2025.112736\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Long-term time series forecasting (LTSF), as a key research domain with pervasive applications in real-world scenarios, has garnered sustained interest from both academic and industrial communities. Although transformer-based models have demonstrated high predictive capability in capturing long-term temporal dependencies, most of them directly process raw data in the time domain while ignoring the representation of features in the frequency domain. Additionally, transformer models with frequency domain often learn weights directly but overlook frequency statistics for time series, leading to the impact of low-quality interference frequencies. Moreover, Transformer’s self-attention captures correlations solely within sequences but neglects correlations among different sequences, increasing susceptibility to overfitting. To address these issues, we innovatively design an adaptive frequency enhancement transformer (AFEformer) with temporal external attention for time series forecasting, which focuses on enhancing important frequency domain features to provide more accurate forecasting. 
Specifically, a frequency domain enhancement module with an adaptive threshold strategy is proposed , using frequency statistics to selectively extract key spectral components and strengthen frequency domain features. Furthermore, the temporal external attention enhancement module with Infinite Norm and dropout layer is presented to explore potential correlations between different sample sequences and mitigate overfitting. Regarding long-term forecasting, comprehensive experiments demonstrate that AFEformer achieves state-of-the-art forecasting performance on nine time series forecasting benchmarks.</div></div>\",\"PeriodicalId\":50523,\"journal\":{\"name\":\"Engineering Applications of Artificial Intelligence\",\"volume\":\"163 \",\"pages\":\"Article 112736\"},\"PeriodicalIF\":8.0000,\"publicationDate\":\"2025-10-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Engineering Applications of Artificial Intelligence\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0952197625027678\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Engineering Applications of Artificial Intelligence","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0952197625027678","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
AFEformer: An adaptive frequency enhancement transformer for time series prediction
Long-term time series forecasting (LTSF), a key research area with pervasive real-world applications, has garnered sustained interest from both academia and industry. Although transformer-based models capture long-term temporal dependencies well, most of them process raw data directly in the time domain and ignore feature representations in the frequency domain. Additionally, frequency-domain transformer models often learn spectral weights directly but overlook the frequency statistics of the time series, leaving them exposed to low-quality interference frequencies. Moreover, the transformer's self-attention captures correlations only within a sequence and neglects correlations among different sequences, increasing susceptibility to overfitting. To address these issues, we design an adaptive frequency enhancement transformer (AFEformer) with temporal external attention for time series forecasting, which enhances important frequency-domain features to provide more accurate forecasts. Specifically, a frequency-domain enhancement module with an adaptive threshold strategy is proposed; it uses frequency statistics to selectively extract key spectral components and strengthen frequency-domain features. Furthermore, a temporal external attention enhancement module with an infinity norm and a dropout layer is presented to explore potential correlations between different sample sequences and mitigate overfitting. Comprehensive experiments demonstrate that AFEformer achieves state-of-the-art long-term forecasting performance on nine time series forecasting benchmarks.
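The paper's implementation is not reproduced here, but the two mechanisms the abstract describes can be illustrated with a minimal PyTorch sketch: an adaptive-threshold step that keeps only spectral components whose magnitude exceeds a statistics-based threshold, and an external-attention layer with infinity-norm scaling and dropout. All names (FrequencyEnhancement, ExternalAttention) and hyperparameters (k, mem_size, p_drop) are hypothetical illustrations under assumed (batch, length, channels) inputs, not the authors' architecture or API:

import torch
import torch.nn as nn

class FrequencyEnhancement(nn.Module):
    """Keep only spectral components whose magnitude exceeds an adaptive
    threshold derived from per-series frequency statistics (mean + k * std)."""
    def __init__(self, k: float = 1.0):
        super().__init__()
        self.k = k  # assumed sensitivity hyperparameter for the threshold

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, length, channels); transform along the time axis
        spec = torch.fft.rfft(x, dim=1)
        mag = spec.abs()
        # adaptive threshold from the frequency statistics of each series
        thresh = mag.mean(dim=1, keepdim=True) + self.k * mag.std(dim=1, keepdim=True)
        mask = (mag >= thresh).to(mag.dtype)
        return torch.fft.irfft(spec * mask, n=x.size(1), dim=1)

class ExternalAttention(nn.Module):
    """External attention with shared memory units, infinity-norm scaling of the
    attention map, and dropout to curb overfitting (a plausible reading of the
    abstract, not the published module)."""
    def __init__(self, d_model: int, mem_size: int = 64, p_drop: float = 0.1):
        super().__init__()
        self.mk = nn.Linear(d_model, mem_size, bias=False)  # external key memory
        self.mv = nn.Linear(mem_size, d_model, bias=False)  # external value memory
        self.dropout = nn.Dropout(p_drop)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        attn = self.mk(x)                                    # (batch, length, mem_size)
        # scale each row by its infinity norm before normalizing
        attn = attn / (attn.abs().amax(dim=-1, keepdim=True) + 1e-6)
        attn = self.dropout(torch.softmax(attn, dim=-1))
        return self.mv(attn)                                 # (batch, length, d_model)

if __name__ == "__main__":
    x = torch.randn(8, 96, 7)                  # e.g. 96-step window, 7 variables
    x = FrequencyEnhancement(k=1.0)(x)
    y = ExternalAttention(d_model=7)(x)
    print(y.shape)                             # torch.Size([8, 96, 7])

Because the memory units in the external-attention sketch are shared across all samples, correlations between different sequences can influence the learned attention, which is the behavior the abstract attributes to temporal external attention.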
Journal introduction:
Artificial Intelligence (AI) is pivotal in driving the fourth industrial revolution, witnessing remarkable advancements across various machine learning methodologies. AI techniques have become indispensable tools for practicing engineers, enabling them to tackle previously insurmountable challenges. Engineering Applications of Artificial Intelligence serves as a global platform for the swift dissemination of research elucidating the practical application of AI methods across all engineering disciplines. Submitted papers are expected to present novel aspects of AI utilized in real-world engineering applications, validated using publicly available datasets to ensure the replicability of research outcomes. Join us in exploring the transformative potential of AI in engineering.