{"title":"JCCMTM: Joint channel-independent and channel-dependent strategy for masked multivariate time-series modeling.","authors":"Qi Li, Zhenyu Zhang, Yong Zhang, Zhao Zhang, Lin Zhu, Xiaolei Hua, Renkai Yu, Xinwen Fan, Zhe Lei, Junlan Feng","doi":"10.1016/j.neunet.2025.107922","DOIUrl":null,"url":null,"abstract":"<p><p>Multivariate time series (MTS) modeling has become omnipresent in extensive areas. A primary challenge in MTS data modeling is capturing the intricate series dependencies. Mainstream modeling strategies include channel-independent (CI), channel-dependent (CD), and their joint versions. Recently, supervised frameworks based on joint strategies have achieved remarkable success in MTS modeling, but they are typically designed for specific tasks. In contrast, self-supervised pre-training frameworks have shown promise in Masked Time-series Modeling (MTM) for benefiting various tasks. However, existing frameworks often overlook the inter-series dependencies across time, referred to as cross-series dependencies, in MTS. This paper thus presents JCCMTM, a Joint CI and CD (JCC) strategy-based pre-training framework for MTM. JCCMTM leverages both intra-series and cross-series dependencies in MTS data to reconstruct masked time-series segments, encouraging the model to focus on relationships between channels. To effectively model cross-series dependencies, we propose the Time-Series-as-Sentence (TSaS), which incorporates cross-series contextual information of MTS segments. Furthermore, JCCMTM introduces a novel embedding transformation paradigm, the Uni-Mul Transformation, to address the embedding alignment issues that arise when applying JCC to MTM. Additionally, two optimization schemes, based on sparse attention and global tokens, respectively, are proposed to reduce JCCMTM's computational complexity. Experimentally, JCCMTM demonstrates outstanding fine-tuning performance compared to the most advanced time series supervised and pre-training methods in two canonical time series analysis tasks: long-term forecasting and anomaly detection. The code for JCCMTM is available at https://github.com/Torea-L/JCCMTM.</p>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"192 ","pages":"107922"},"PeriodicalIF":6.3000,"publicationDate":"2025-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1016/j.neunet.2025.107922","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Multivariate time series (MTS) modeling is now ubiquitous across a wide range of application domains. A primary challenge in modeling MTS data is capturing the intricate dependencies among series. Mainstream modeling strategies include channel-independent (CI), channel-dependent (CD), and joint CI-CD approaches. Recently, supervised frameworks based on joint strategies have achieved remarkable success in MTS modeling, but they are typically designed for specific tasks. In contrast, self-supervised pre-training frameworks built on Masked Time-series Modeling (MTM) have shown promise across a variety of downstream tasks. However, existing frameworks often overlook the inter-series dependencies across time in MTS, referred to here as cross-series dependencies. This paper therefore presents JCCMTM, a pre-training framework for MTM based on a Joint CI and CD (JCC) strategy. JCCMTM leverages both intra-series and cross-series dependencies in MTS data to reconstruct masked time-series segments, encouraging the model to attend to relationships between channels. To model cross-series dependencies effectively, we propose Time-Series-as-Sentence (TSaS), a representation that incorporates the cross-series context of MTS segments. Furthermore, JCCMTM introduces a novel embedding transformation paradigm, the Uni-Mul Transformation, to address the embedding alignment issues that arise when applying JCC to MTM. Additionally, two optimization schemes, based on sparse attention and global tokens respectively, reduce JCCMTM's computational complexity. Experimentally, JCCMTM achieves outstanding fine-tuning performance compared with state-of-the-art supervised and pre-trained time-series methods on two canonical time-series analysis tasks: long-term forecasting and anomaly detection. The code for JCCMTM is available at https://github.com/Torea-L/JCCMTM.
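To make the core idea concrete, below is a minimal PyTorch sketch of masked multivariate time-series reconstruction with a joint channel-independent / channel-dependent encoder. This is not the authors' JCCMTM implementation (which additionally includes TSaS, the Uni-Mul Transformation, and the sparse-attention and global-token optimizations; see the linked repository): all class names, hyperparameters, and the patch-masking scheme here are illustrative assumptions.

```python
# Minimal sketch (NOT the authors' code): masked MTS modeling with a joint
# channel-independent (CI) + channel-dependent (CD) encoder. Hyperparameters
# and the masking scheme are illustrative assumptions.
import torch
import torch.nn as nn


class JointMaskedTSModel(nn.Module):
    def __init__(self, n_channels: int, patch_len: int = 16, d_model: int = 64):
        super().__init__()
        self.patch_len = patch_len
        # CI: patches of every channel are embedded with shared weights,
        # without mixing information across channels.
        self.patch_embed = nn.Linear(patch_len, d_model)
        # CD: attention across channels at each patch position, capturing
        # cross-series dependencies.
        self.cross_channel_attn = nn.MultiheadAttention(d_model, num_heads=4,
                                                        batch_first=True)
        # Temporal encoding within each channel (again channel-independent).
        self.temporal_encoder = nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead=4, batch_first=True),
            num_layers=2,
        )
        self.head = nn.Linear(d_model, patch_len)
        self.mask_token = nn.Parameter(torch.zeros(d_model))

    def forward(self, x: torch.Tensor, mask_ratio: float = 0.4):
        # x: (batch, n_channels, seq_len), seq_len divisible by patch_len.
        b, c, t = x.shape
        patches = x.unfold(-1, self.patch_len, self.patch_len)  # (b, c, n, p)
        n = patches.size(2)
        tokens = self.patch_embed(patches)                      # CI embedding

        # Randomly mask a fraction of patch tokens (True = masked).
        mask = torch.rand(b, c, n, device=x.device) < mask_ratio
        tokens = torch.where(mask.unsqueeze(-1),
                             self.mask_token.expand_as(tokens), tokens)

        # CD step: attend across channels at each patch position.
        z = tokens.permute(0, 2, 1, 3).reshape(b * n, c, -1)
        z, _ = self.cross_channel_attn(z, z, z)
        z = z.reshape(b, n, c, -1).permute(0, 2, 1, 3)

        # CI step: temporal encoding within each channel.
        z = self.temporal_encoder(z.reshape(b * c, n, -1)).reshape(b, c, n, -1)

        recon = self.head(z)                                    # (b, c, n, p)
        # Reconstruction loss on masked patches only, as in masked modeling.
        loss = ((recon - patches) ** 2).mean(-1)[mask].mean()
        return loss, recon


if __name__ == "__main__":
    model = JointMaskedTSModel(n_channels=7)
    x = torch.randn(8, 7, 96)  # e.g. 7 channels (ETT-style), length 96
    loss, _ = model(x)
    loss.backward()
    print(f"masked reconstruction loss: {loss.item():.4f}")
```

The sketch alternates a cross-channel attention pass (the CD component) with per-channel temporal encoding (the CI component), and computes the reconstruction loss only over masked patches; how JCCMTM actually combines the two strategies, and how TSaS and Uni-Mul reshape the embeddings, is specified in the paper and repository.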
Journal overview:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.