Jiacheng Li, Wei Chen, Yican Liu, Junmei Yang, Zhiheng Zhou, Delu Zeng
{"title":"扩散信息器:用于长序列时间序列预测的扩散信息器模型","authors":"Jiacheng Li , Wei Chen , Yican Liu , Junmei Yang , Zhiheng Zhou , Delu Zeng","doi":"10.1016/j.eswa.2025.129944","DOIUrl":null,"url":null,"abstract":"<div><div>Long sequence time-series forecasting (LSTF) is a significant research area with wide-ranging applications in energy, transportation, meteorology, and finance. Current methods primarily rely on statistical machine learning and deep neural network techniques to model historical time series for long-term forecasting. In recent years, Transformer-based models have demonstrated outstanding performance in forecasting, but their high computational costs limit their application. The Informer model addresses issues of high computational complexity and the management of long sequence inputs and outputs. However, existing models still face prediction bottlenecks under limited computational resources. The powerful generative capability of diffusion models can significantly enhance time series forecasting. We propose the Diffinformer model, which utilizes generative models for forecasting. Specifically, it combines conditional diffusion models with the ProbSparse self-attention distilling mechanism of Informer and incorporates the output of the diffusion model into the decoder to capture distant dependencies of observations from the perspective of dynamic systems. Comprehensive experimental results across five large-scale datasets demonstrate that Diffinformer improves predictive accuracy and outperforms corresponding baselines, offering a novel solution to the LSTF problem.</div></div>","PeriodicalId":50461,"journal":{"name":"Expert Systems with Applications","volume":"299 ","pages":"Article 129944"},"PeriodicalIF":7.5000,"publicationDate":"2025-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Diffinformer: Diffusion informer model for long sequence time-series forecasting\",\"authors\":\"Jiacheng Li , Wei Chen , Yican Liu , Junmei Yang , Zhiheng Zhou , Delu Zeng\",\"doi\":\"10.1016/j.eswa.2025.129944\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Long sequence time-series forecasting (LSTF) is a significant research area with wide-ranging applications in energy, transportation, meteorology, and finance. Current methods primarily rely on statistical machine learning and deep neural network techniques to model historical time series for long-term forecasting. In recent years, Transformer-based models have demonstrated outstanding performance in forecasting, but their high computational costs limit their application. The Informer model addresses issues of high computational complexity and the management of long sequence inputs and outputs. However, existing models still face prediction bottlenecks under limited computational resources. The powerful generative capability of diffusion models can significantly enhance time series forecasting. We propose the Diffinformer model, which utilizes generative models for forecasting. Specifically, it combines conditional diffusion models with the ProbSparse self-attention distilling mechanism of Informer and incorporates the output of the diffusion model into the decoder to capture distant dependencies of observations from the perspective of dynamic systems. 
Comprehensive experimental results across five large-scale datasets demonstrate that Diffinformer improves predictive accuracy and outperforms corresponding baselines, offering a novel solution to the LSTF problem.</div></div>\",\"PeriodicalId\":50461,\"journal\":{\"name\":\"Expert Systems with Applications\",\"volume\":\"299 \",\"pages\":\"Article 129944\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2025-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Expert Systems with Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0957417425035596\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Expert Systems with Applications","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0957417425035596","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Diffinformer: Diffusion informer model for long sequence time-series forecasting
Long sequence time-series forecasting (LSTF) is a significant research area with wide-ranging applications in energy, transportation, meteorology, and finance. Current methods primarily rely on statistical machine learning and deep neural network techniques to model historical time series for long-term forecasting. In recent years, Transformer-based models have demonstrated outstanding forecasting performance, but their high computational costs limit their application. The Informer model addresses the high computational complexity and the handling of long sequence inputs and outputs; however, existing models still face prediction bottlenecks under limited computational resources. The powerful generative capability of diffusion models can significantly enhance time series forecasting. We propose the Diffinformer model, which leverages a generative model for forecasting. Specifically, it combines a conditional diffusion model with Informer's ProbSparse self-attention distilling mechanism and feeds the output of the diffusion model into the decoder to capture long-range dependencies among observations from a dynamical-systems perspective. Comprehensive experimental results across five large-scale datasets demonstrate that Diffinformer improves predictive accuracy and outperforms the corresponding baselines, offering a novel solution to the LSTF problem.
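To make the architectural description concrete, below is a minimal PyTorch sketch of how a conditional diffusion head might be wired into an Informer-style encoder-decoder for LSTF. It is not the authors' implementation: the module names, tensor shapes, mean-pooled conditioning, and the step that feeds the diffusion output to the decoder are illustrative assumptions, and Informer's ProbSparse attention with distilling is approximated here by standard Transformer encoder layers plus a strided 1D convolution.

```python
# Hypothetical sketch of the Diffinformer idea: an Informer-style encoder
# conditions a diffusion head, and the diffusion output feeds the decoder.
import torch
import torch.nn as nn


class ProbSparseEncoder(nn.Module):
    """Stand-in for Informer's ProbSparse self-attention + distilling stack:
    standard encoder layers followed by a strided conv that halves the length."""
    def __init__(self, d_model=64, n_heads=4, n_layers=2):
        super().__init__()
        layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, n_layers)
        self.distill = nn.Conv1d(d_model, d_model, kernel_size=3, stride=2, padding=1)

    def forward(self, x):                                   # x: (B, L, d_model)
        h = self.encoder(x)
        return self.distill(h.transpose(1, 2)).transpose(1, 2)  # (B, L/2, d_model)


class ConditionalDiffusionHead(nn.Module):
    """Toy conditional denoiser: predicts the noise injected into the future
    window, conditioned on an encoder summary and a diffusion timestep."""
    def __init__(self, d_model=64, horizon=24):
        super().__init__()
        self.denoiser = nn.Sequential(
            nn.Linear(horizon + d_model + 1, 128), nn.SiLU(), nn.Linear(128, horizon)
        )

    def forward(self, noisy_future, t, cond):  # (B, H), (B, 1), (B, d_model)
        return self.denoiser(torch.cat([noisy_future, cond, t], dim=-1))


class Diffinformer(nn.Module):
    """Assumed wiring: encoder features condition the diffusion head, and the
    diffusion output becomes the decoder queries alongside the encoder memory."""
    def __init__(self, d_model=64, horizon=24):
        super().__init__()
        self.embed = nn.Linear(1, d_model)
        self.encoder = ProbSparseEncoder(d_model)
        self.diffusion = ConditionalDiffusionHead(d_model, horizon)
        dec_layer = nn.TransformerDecoderLayer(d_model, 4, batch_first=True)
        self.decoder = nn.TransformerDecoder(dec_layer, 1)
        self.diff_embed = nn.Linear(1, d_model)
        self.head = nn.Linear(d_model, 1)

    def forward(self, history, noisy_future, t):
        memory = self.encoder(self.embed(history.unsqueeze(-1)))  # (B, L/2, d)
        cond = memory.mean(dim=1)                                 # (B, d)
        eps_hat = self.diffusion(noisy_future, t, cond)           # (B, H)
        queries = self.diff_embed(eps_hat.unsqueeze(-1))          # (B, H, d)
        out = self.decoder(queries, memory)                       # (B, H, d)
        return self.head(out).squeeze(-1), eps_hat                # forecast, noise


if __name__ == "__main__":
    B, L, H = 8, 96, 24
    model = Diffinformer(horizon=H)
    history = torch.randn(B, L)
    future = torch.randn(B, H)
    t = torch.rand(B, 1)                       # normalized diffusion timestep
    noise = torch.randn_like(future)
    noisy_future = future + noise              # simplified forward-noising step
    forecast, eps_hat = model(history, noisy_future, t)
    loss = nn.functional.mse_loss(eps_hat, noise)  # denoising objective
    print(forecast.shape, loss.item())
```

The usage example trains the head with the standard denoising objective (predicting the injected noise); the actual noise schedule, sampling procedure, and decoder-fusion strategy used in Diffinformer may differ from this sketch.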
Journal introduction:
Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.