{"title":"TWIST:利用时窗和稀疏注意力进行交通预测的高效时空变换器","authors":"Peng Wang;Longxi Feng;Wenhao Zhang;Kanghua Hui","doi":"10.1109/JIOT.2025.3561542","DOIUrl":null,"url":null,"abstract":"Accurate traffic prediction is crucial for intelligent transportation systems (ITS). However, recent advancements in deep learning have encountered significant challenges in effectively capturing long-range dependencies and modeling complex intervariable relationships, particularly under real-time processing constraints. These limitations primarily arise from the inherent tradeoff in current model architectures: while some are optimized for short-term forecasting with constrained receptive fields, others prioritize long-term prediction accuracy at the cost of computational efficiency. Moreover, the dynamic and intricate interactions among nodes present significant scalability challenges, increasingly hindering efficient modeling as the road network grows. To address these limitations, we propose temporal window and sparse spatial attention transformer (TWIST), a novel architecture specifically designed for traffic forecasting. The proposed framework incorporates two key innovations: 1) a trend-aware window attention mechanism in the temporal dimension that effectively and efficiently captures multiscale temporal dynamics and 2) a sparse attention mechanism in the spatial dimension that optimizes computational efficiency by selectively focusing on significant spatial nodes. Extensive experiments on eight real-world datasets demonstrate that TWIST consistently outperforms state-of-the-art methods across various scenarios, including regular, long-term, and large-scale traffic forecasting tasks, while maintaining competitive computational efficiency. Our code is publicly available at <uri>https://github.com/STGTraffic/TWIST</uri>.","PeriodicalId":54347,"journal":{"name":"IEEE Internet of Things Journal","volume":"12 14","pages":"26799-26815"},"PeriodicalIF":8.9000,"publicationDate":"2025-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"TWIST: An Efficient Spatial—Temporal Transformer With Temporal Window and Sparse Attention for Traffic Forecasting\",\"authors\":\"Peng Wang;Longxi Feng;Wenhao Zhang;Kanghua Hui\",\"doi\":\"10.1109/JIOT.2025.3561542\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Accurate traffic prediction is crucial for intelligent transportation systems (ITS). However, recent advancements in deep learning have encountered significant challenges in effectively capturing long-range dependencies and modeling complex intervariable relationships, particularly under real-time processing constraints. These limitations primarily arise from the inherent tradeoff in current model architectures: while some are optimized for short-term forecasting with constrained receptive fields, others prioritize long-term prediction accuracy at the cost of computational efficiency. Moreover, the dynamic and intricate interactions among nodes present significant scalability challenges, increasingly hindering efficient modeling as the road network grows. To address these limitations, we propose temporal window and sparse spatial attention transformer (TWIST), a novel architecture specifically designed for traffic forecasting. 
The proposed framework incorporates two key innovations: 1) a trend-aware window attention mechanism in the temporal dimension that effectively and efficiently captures multiscale temporal dynamics and 2) a sparse attention mechanism in the spatial dimension that optimizes computational efficiency by selectively focusing on significant spatial nodes. Extensive experiments on eight real-world datasets demonstrate that TWIST consistently outperforms state-of-the-art methods across various scenarios, including regular, long-term, and large-scale traffic forecasting tasks, while maintaining competitive computational efficiency. Our code is publicly available at <uri>https://github.com/STGTraffic/TWIST</uri>.\",\"PeriodicalId\":54347,\"journal\":{\"name\":\"IEEE Internet of Things Journal\",\"volume\":\"12 14\",\"pages\":\"26799-26815\"},\"PeriodicalIF\":8.9000,\"publicationDate\":\"2025-04-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Internet of Things Journal\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10966151/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Internet of Things Journal","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10966151/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
TWIST: An Efficient Spatial—Temporal Transformer With Temporal Window and Sparse Attention for Traffic Forecasting
Accurate traffic prediction is crucial for intelligent transportation systems (ITS). However, recent deep learning methods still face significant challenges in effectively capturing long-range dependencies and modeling complex intervariable relationships, particularly under real-time processing constraints. These limitations primarily arise from an inherent tradeoff in current model architectures: some are optimized for short-term forecasting with constrained receptive fields, while others prioritize long-term prediction accuracy at the cost of computational efficiency. Moreover, the dynamic and intricate interactions among nodes present significant scalability challenges, increasingly hindering efficient modeling as the road network grows. To address these limitations, we propose the temporal window and sparse spatial attention transformer (TWIST), a novel architecture specifically designed for traffic forecasting. The proposed framework incorporates two key innovations: 1) a trend-aware window attention mechanism in the temporal dimension that effectively and efficiently captures multiscale temporal dynamics and 2) a sparse attention mechanism in the spatial dimension that optimizes computational efficiency by selectively focusing on significant spatial nodes. Extensive experiments on eight real-world datasets demonstrate that TWIST consistently outperforms state-of-the-art methods across various scenarios, including regular, long-term, and large-scale traffic forecasting tasks, while maintaining competitive computational efficiency. Our code is publicly available at https://github.com/STGTraffic/TWIST.
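The abstract describes the two mechanisms only at a high level, so the following is a minimal PyTorch sketch of what they could look like: self-attention restricted to fixed-length temporal windows, and spatial attention pruned to the top-k strongest node-to-node scores. All module names, tensor shapes, and the top-k pruning rule are assumptions made for illustration; this is not the authors' implementation (see the repository linked in the abstract for the actual code).

# Minimal sketch (not the authors' released code) of windowed temporal attention
# and top-k sparse spatial attention, as described at a high level in the abstract.
import torch
import torch.nn as nn
import torch.nn.functional as F


class WindowedTemporalAttention(nn.Module):
    """Self-attention applied independently within fixed-length temporal windows."""

    def __init__(self, d_model: int, n_heads: int, window: int):
        super().__init__()
        self.window = window
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch * nodes, time, d_model); time must be divisible by window here.
        b, t, d = x.shape
        x = x.reshape(b * (t // self.window), self.window, d)
        out, _ = self.attn(x, x, x)  # attention restricted to each window
        return out.reshape(b, t, d)


class TopKSpatialAttention(nn.Module):
    """Attention over nodes that keeps only the k largest scores per query node."""

    def __init__(self, d_model: int, k: int):
        super().__init__()
        self.q = nn.Linear(d_model, d_model)
        self.kv = nn.Linear(d_model, 2 * d_model)
        self.k = k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, nodes, d_model)
        q = self.q(x)
        key, val = self.kv(x).chunk(2, dim=-1)
        scores = q @ key.transpose(-2, -1) / key.shape[-1] ** 0.5   # (batch, nodes, nodes)
        kth = scores.topk(self.k, dim=-1).values[..., -1:]          # k-th largest score per query
        scores = scores.masked_fill(scores < kth, float("-inf"))    # drop weaker spatial links
        return F.softmax(scores, dim=-1) @ val


if __name__ == "__main__":
    x_t = torch.randn(2 * 10, 12, 64)   # (batch * nodes, time, d_model)
    x_s = torch.randn(2, 10, 64)        # (batch, nodes, d_model)
    print(WindowedTemporalAttention(64, 4, window=4)(x_t).shape)
    print(TopKSpatialAttention(64, k=3)(x_s).shape)

Restricting temporal attention to windows and keeping only k spatial links per node avoids the quadratic cost of full attention over long horizons and large road networks, which is the efficiency argument the abstract makes; the paper's trend-aware windowing and node-selection criteria may differ from this sketch.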
Journal overview:
The IEEE Internet of Things (IoT) Journal publishes articles and review articles covering various aspects of IoT, including IoT system architecture, IoT enabling technologies, IoT communication and networking protocols such as network coding, and IoT services and applications. Topics encompass IoT's impacts on sensor technologies, big data management, and future internet design for applications like smart cities and smart homes. Fields of interest include IoT architecture such as things-centric, data-centric, and service-oriented IoT architecture; IoT enabling technologies and systematic integration such as sensor technologies, big sensor data management, and future Internet design for IoT; IoT services, applications, and test-beds such as IoT service middleware, IoT application programming interfaces (APIs), IoT application design, and IoT trials/experiments; and IoT standardization activities and technology development in different standards development organizations (SDOs) such as IEEE, IETF, ITU, 3GPP, ETSI, etc.