VTformer: a novel multiscale linear transformer forecaster with variate-temporal dependency for multivariate time series

Rui Dai, Zheng Wang, Jing Jie, Wanliang Wang, Qianlin Ye

Complex & Intelligent Systems (published 2025-04-24). DOI: https://doi.org/10.1007/s40747-025-01866-0
Abstract
Recently, the success of linear models has raised questions about the sequence-modeling capability of Transformer forecasters. Although the latest Transformer-based studies have alleviated some of these concerns, limited information utilization still constrains comprehensive exploration of complex dependencies: these forecasters often prioritize global dependence across time stamps while overlooking correlations between different variates. To this end, we revisit the roles of Transformer components and present an efficient, lightweight Transformer forecaster named VTformer. Concretely, a Transformer with multiscale linear attention is constructed to mine, in parallel, the global variate correlations and long-term temporal dependencies of time series data, providing multifaceted dynamics for the downstream self-attention mechanism. Moreover, a novel adaptive fusion method is designed to propagate complementary information from both the variate and temporal perspectives to improve prediction. Extensive experiments on eight real-world datasets demonstrate that VTformer outperforms state-of-the-art models on long-term Multivariate Time Series Forecasting (MTSF) tasks, thereby advancing both the accuracy and efficiency of Transformers.
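The abstract describes two ingredients: linear attention run in parallel along the temporal axis and the variate axis, and an adaptive fusion of the two views. The paper's implementation is not reproduced here; the PyTorch sketch below is a hypothetical reading of that design. All names (`LinearAttention`, `VariateTemporalBlock`), the elu+1 feature map, and the mean-pool-and-gate fusion are our own illustrative assumptions, and the multiscale component is omitted for brevity.

```python
# Hypothetical sketch of a VTformer-style dual-branch linear-attention block.
# Module and variable names are illustrative choices, not the authors' code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class LinearAttention(nn.Module):
    """Kernelized linear attention, O(N) in token count, via the
    phi(Q) (phi(K)^T V) factorization with phi(x) = elu(x) + 1."""

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, N, D)
        q = F.elu(self.q_proj(x)) + 1.0                  # positive feature map
        k = F.elu(self.k_proj(x)) + 1.0
        v = self.v_proj(x)
        kv = torch.einsum("bnd,bne->bde", k, v)          # K^T V first: (B, D, D)
        z = 1.0 / (torch.einsum("bnd,bd->bn", q, k.sum(dim=1)) + 1e-6)
        return torch.einsum("bnd,bde,bn->bne", q, kv, z)


class VariateTemporalBlock(nn.Module):
    """Runs linear attention along the temporal axis and the variate axis
    in parallel, then merges the two views with a learned sigmoid gate
    (an assumed stand-in for the paper's adaptive fusion)."""

    def __init__(self, n_vars: int, seq_len: int, d_model: int):
        super().__init__()
        self.temporal_embed = nn.Linear(n_vars, d_model)  # one token per time step
        self.variate_embed = nn.Linear(seq_len, d_model)  # one token per variate
        self.temporal_attn = LinearAttention(d_model)
        self.variate_attn = LinearAttention(d_model)
        self.gate = nn.Linear(2 * d_model, d_model)

    def forward(self, x: torch.Tensor) -> torch.Tensor:  # x: (B, L, V)
        t_tokens = self.temporal_attn(self.temporal_embed(x))                # (B, L, D)
        v_tokens = self.variate_attn(self.variate_embed(x.transpose(1, 2)))  # (B, V, D)
        # Pool the variate view, broadcast it across time, then gate the two views.
        v_ctx = v_tokens.mean(dim=1, keepdim=True).expand_as(t_tokens)
        g = torch.sigmoid(self.gate(torch.cat([t_tokens, v_ctx], dim=-1)))
        return g * t_tokens + (1.0 - g) * v_ctx


if __name__ == "__main__":
    x = torch.randn(8, 96, 7)  # batch of 8, lookback 96, 7 variates (ETT-style)
    block = VariateTemporalBlock(n_vars=7, seq_len=96, d_model=64)
    print(block(x).shape)      # torch.Size([8, 96, 64])
```

The design point the sketch illustrates is why linear attention makes the dual-branch layout affordable: computing K^T V before multiplying by Q drops the cost from O(N^2 D) to O(N D^2), so attention can be run once over L time steps and once over V variates per block without the quadratic blow-up of standard self-attention.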
About the journal
Complex & Intelligent Systems aims to provide a forum for presenting and discussing novel approaches, tools and techniques meant for attaining a cross-fertilization between the broad fields of complex systems, computational simulation, and intelligent analytics and visualization. The transdisciplinary research that the journal focuses on will expand the boundaries of our understanding by investigating the principles and processes that underlie many of the most profound problems facing society today.