Physically-guided temporal diffusion transformer for long-term time series forecasting

IF 7.2 · Tier 1, Computer Science · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
{"title":"用于长期时间序列预测的物理引导时间扩散变换器","authors":"","doi":"10.1016/j.knosys.2024.112508","DOIUrl":null,"url":null,"abstract":"<div><p>Transformer has shown excellent performance in long-term time series forecasting because of its capability to capture long-term dependencies. However, existing Transformer-based approaches often overlook the unique characteristics inherent to time series, particularly multi-scale periodicity, which leads to a gap in inductive biases. To address this oversight, the temporal diffusion Transformer (TDT) was developed in this study to reveal the intrinsic evolution processes of time series. First, to uncover the connections among the periods of multi-periodic time series, the series are transformed into various types of patches using a multi-scale Patch method. Inspired by the principles of heat conduction, TDT conceptualizes the evolution of a time series as a diffusion process. TDT aims to achieve global consistency by minimizing energy constraints, which is accomplished through the iterative updating of patches. Finally, the results of these iterations across multiple periods are aggregated to form the TDT output. Compared to previous advanced models, TDT achieved state-of-the-art predictive performance in our experiments. In most datasets, TDT outperformed the baseline model by approximately 2% in terms of mean square error (MSE) and mean absolute error (MAE). Its effectiveness was further validated through ablation, efficiency, and hyperparameter analyses. TDT offers intuitive explanations by elucidating the diffusion process of time series patches throughout the iterative procedure.</p></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":null,"pages":null},"PeriodicalIF":7.2000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Physically-guided temporal diffusion transformer for long-term time series forecasting\",\"authors\":\"\",\"doi\":\"10.1016/j.knosys.2024.112508\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Transformer has shown excellent performance in long-term time series forecasting because of its capability to capture long-term dependencies. However, existing Transformer-based approaches often overlook the unique characteristics inherent to time series, particularly multi-scale periodicity, which leads to a gap in inductive biases. To address this oversight, the temporal diffusion Transformer (TDT) was developed in this study to reveal the intrinsic evolution processes of time series. First, to uncover the connections among the periods of multi-periodic time series, the series are transformed into various types of patches using a multi-scale Patch method. Inspired by the principles of heat conduction, TDT conceptualizes the evolution of a time series as a diffusion process. TDT aims to achieve global consistency by minimizing energy constraints, which is accomplished through the iterative updating of patches. Finally, the results of these iterations across multiple periods are aggregated to form the TDT output. Compared to previous advanced models, TDT achieved state-of-the-art predictive performance in our experiments. In most datasets, TDT outperformed the baseline model by approximately 2% in terms of mean square error (MSE) and mean absolute error (MAE). Its effectiveness was further validated through ablation, efficiency, and hyperparameter analyses. 
TDT offers intuitive explanations by elucidating the diffusion process of time series patches throughout the iterative procedure.</p></div>\",\"PeriodicalId\":49939,\"journal\":{\"name\":\"Knowledge-Based Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":7.2000,\"publicationDate\":\"2024-09-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Knowledge-Based Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0950705124011420\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705124011420","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract


Transformer has shown excellent performance in long-term time series forecasting because of its capability to capture long-term dependencies. However, existing Transformer-based approaches often overlook the unique characteristics inherent to time series, particularly multi-scale periodicity, which leads to a gap in inductive biases. To address this oversight, the temporal diffusion Transformer (TDT) was developed in this study to reveal the intrinsic evolution processes of time series. First, to uncover the connections among the periods of multi-periodic time series, the series are transformed into various types of patches using a multi-scale Patch method. Inspired by the principles of heat conduction, TDT conceptualizes the evolution of a time series as a diffusion process. TDT aims to achieve global consistency by minimizing energy constraints, which is accomplished through the iterative updating of patches. Finally, the results of these iterations across multiple periods are aggregated to form the TDT output. Compared to previous advanced models, TDT achieved state-of-the-art predictive performance in our experiments. In most datasets, TDT outperformed the baseline model by approximately 2% in terms of mean square error (MSE) and mean absolute error (MAE). Its effectiveness was further validated through ablation, efficiency, and hyperparameter analyses. TDT offers intuitive explanations by elucidating the diffusion process of time series patches throughout the iterative procedure.
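The abstract describes two mechanisms: splitting a multi-periodic series into patches at several period lengths, and iteratively updating those patches as a heat-conduction-style diffusion that minimizes an energy term. The snippet below is a minimal illustrative sketch of those two ideas only; the array shapes, the `periods` list, and the explicit Laplacian update rule are assumptions for illustration and are not the authors' TDT implementation.

```python
# Illustrative sketch (not the paper's code): multi-scale patching plus a
# heat-diffusion-style iterative update over patches.
import numpy as np

def multi_scale_patches(x: np.ndarray, periods: list[int]) -> dict[int, np.ndarray]:
    """Split a 1-D series into non-overlapping patches, one set per period length."""
    patches = {}
    for p in periods:
        n = len(x) // p                        # number of whole patches at this scale
        patches[p] = x[: n * p].reshape(n, p)  # (num_patches, patch_length)
    return patches

def diffuse(patches: np.ndarray, steps: int = 10, alpha: float = 0.2) -> np.ndarray:
    """Iteratively pull each patch toward its neighbours (discrete heat equation).

    Each step lowers the energy sum_i ||h_i - h_{i-1}||^2 over adjacent patches,
    i.e. it drives the patch sequence toward global consistency.
    """
    h = patches.astype(float).copy()
    for _ in range(steps):
        lap = np.zeros_like(h)
        lap[1:-1] = h[:-2] - 2 * h[1:-1] + h[2:]  # discrete Laplacian over the patch index
        h += alpha * lap                           # explicit Euler diffusion step
    return h

if __name__ == "__main__":
    t = np.arange(512)
    series = (np.sin(2 * np.pi * t / 24)
              + 0.3 * np.sin(2 * np.pi * t / 168)
              + 0.1 * np.random.randn(512))
    # Hypothetical daily/weekly periods; the paper infers periods from the data.
    for period, p in multi_scale_patches(series, periods=[24, 168]).items():
        smoothed = diffuse(p)
        print(period, p.shape, float(np.abs(p - smoothed).mean()))
```

In the paper this iterative update runs on learned patch representations inside a Transformer and the per-period results are aggregated into the forecast; the sketch only conveys why repeated local updates yield a globally consistent, energy-minimizing sequence of patches.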

Source journal
Knowledge-Based Systems
Category: Engineering & Technology - Computer Science: Artificial Intelligence
CiteScore: 14.80
Self-citation rate: 12.50%
Articles per year: 1245
Review time: 7.8 months
Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.