Zhongju Wang, Zhenhong Sun, Yatao Bian, Huadong Mo, Daoyi Dong
Information Processing & Management, Volume 63, Issue 2, Article 104358. Published 2025-09-01. DOI: 10.1016/j.ipm.2025.104358. Available at: https://www.sciencedirect.com/science/article/pii/S0306457325002997
Learning hierarchical time–frequency representation for long-term time series forecasting
Time series forecasting is essential for planning and management across many domains. Existing models struggle to maintain long-term trends over extended prediction horizons and overlook the interplay between time-domain and frequency-domain dependencies. To address these challenges, we propose TFformer, a Transformer-based hierarchical time–frequency representation architecture with two key innovations: (i) spectrum decomposition isolates long-term patterns from short-term fluctuations, and (ii) sequence aggregation hierarchically integrates the two resulting categories of features, distinguished by their energy intensities. Experiments on six real-world datasets show that TFformer outperforms the frequency-domain baseline (FreTS) by an average of 16.54% in Mean Squared Error (MSE) and surpasses the time-domain baseline (iTransformer) by an average of 5.91% in MSE, highlighting its effectiveness in capturing both time-domain and frequency-domain patterns.
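The spectrum-decomposition idea described in the abstract, separating high-energy (long-term) frequency components from low-energy (short-term) residuals, can be illustrated with a minimal FFT-based sketch. This is a generic decomposition under assumed details (the number of retained frequencies `k` and the top-k energy criterion are illustrative choices), not the paper's actual TFformer implementation:

```python
import numpy as np

def spectrum_decompose(x, k):
    """Split a 1-D series into a long-term component (the k highest-energy
    frequency bins) and a short-term residual, via the real FFT.
    Illustrative sketch only -- not the TFformer algorithm itself."""
    spec = np.fft.rfft(x)
    energy = np.abs(spec)
    # keep only the k highest-energy frequency bins for the long-term part
    top = np.argsort(energy)[-k:]
    mask = np.zeros_like(spec)
    mask[top] = spec[top]
    long_term = np.fft.irfft(mask, n=len(x))
    # everything not captured by the dominant frequencies is treated as
    # short-term fluctuation
    short_term = x - long_term
    return long_term, short_term

# a slow sinusoid plus noise: the sinusoid should land in long_term
rng = np.random.default_rng(0)
t = np.arange(256)
x = np.sin(2 * np.pi * t / 64) + 0.1 * rng.standard_normal(256)
lt, st = spectrum_decompose(x, k=3)
```

By construction the two components sum back to the original series, which is what lets a hierarchical model process them separately and then aggregate them without losing information.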
Journal Introduction:
Information Processing and Management is dedicated to publishing cutting-edge original research at the convergence of computing and information science. Our scope encompasses theory, methods, and applications across various domains, including advertising, business, health, information science, information technology, marketing, and social computing.
We aim to cater to the interests of both primary researchers and practitioners by offering an effective platform for the timely dissemination of advanced and topical issues in this interdisciplinary field. The journal places particular emphasis on original research articles, research survey articles, research method articles, and articles addressing critical applications of research. Join us in advancing knowledge and innovation at the intersection of computing and information science.