JCCMTM: Joint channel-independent and channel-dependent strategy for masked multivariate time-series modeling.

IF 6.3 | CAS Zone 1, Computer Science | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Qi Li, Zhenyu Zhang, Yong Zhang, Zhao Zhang, Lin Zhu, Xiaolei Hua, Renkai Yu, Xinwen Fan, Zhe Lei, Junlan Feng
{"title":"屏蔽多变量时间序列建模的信道独立和信道依赖联合策略。","authors":"Qi Li, Zhenyu Zhang, Yong Zhang, Zhao Zhang, Lin Zhu, Xiaolei Hua, Renkai Yu, Xinwen Fan, Zhe Lei, Junlan Feng","doi":"10.1016/j.neunet.2025.107922","DOIUrl":null,"url":null,"abstract":"<p><p>Multivariate time series (MTS) modeling has become omnipresent in extensive areas. A primary challenge in MTS data modeling is capturing the intricate series dependencies. Mainstream modeling strategies include channel-independent (CI), channel-dependent (CD), and their joint versions. Recently, supervised frameworks based on joint strategies have achieved remarkable success in MTS modeling, but they are typically designed for specific tasks. In contrast, self-supervised pre-training frameworks have shown promise in Masked Time-series Modeling (MTM) for benefiting various tasks. However, existing frameworks often overlook the inter-series dependencies across time, referred to as cross-series dependencies, in MTS. This paper thus presents JCCMTM, a Joint CI and CD (JCC) strategy-based pre-training framework for MTM. JCCMTM leverages both intra-series and cross-series dependencies in MTS data to reconstruct masked time-series segments, encouraging the model to focus on relationships between channels. To effectively model cross-series dependencies, we propose the Time-Series-as-Sentence (TSaS), which incorporates cross-series contextual information of MTS segments. Furthermore, JCCMTM introduces a novel embedding transformation paradigm, the Uni-Mul Transformation, to address the embedding alignment issues that arise when applying JCC to MTM. Additionally, two optimization schemes, based on sparse attention and global tokens, respectively, are proposed to reduce JCCMTM's computational complexity. Experimentally, JCCMTM demonstrates outstanding fine-tuning performance compared to the most advanced time series supervised and pre-training methods in two canonical time series analysis tasks: long-term forecasting and anomaly detection. The code for JCCMTM is available at https://github.com/Torea-L/JCCMTM.</p>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"192 ","pages":"107922"},"PeriodicalIF":6.3000,"publicationDate":"2025-08-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"JCCMTM: Joint channel-independent and channel-dependent strategy for masked multivariate time-series modeling.\",\"authors\":\"Qi Li, Zhenyu Zhang, Yong Zhang, Zhao Zhang, Lin Zhu, Xiaolei Hua, Renkai Yu, Xinwen Fan, Zhe Lei, Junlan Feng\",\"doi\":\"10.1016/j.neunet.2025.107922\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Multivariate time series (MTS) modeling has become omnipresent in extensive areas. A primary challenge in MTS data modeling is capturing the intricate series dependencies. Mainstream modeling strategies include channel-independent (CI), channel-dependent (CD), and their joint versions. Recently, supervised frameworks based on joint strategies have achieved remarkable success in MTS modeling, but they are typically designed for specific tasks. In contrast, self-supervised pre-training frameworks have shown promise in Masked Time-series Modeling (MTM) for benefiting various tasks. However, existing frameworks often overlook the inter-series dependencies across time, referred to as cross-series dependencies, in MTS. This paper thus presents JCCMTM, a Joint CI and CD (JCC) strategy-based pre-training framework for MTM. 
JCCMTM leverages both intra-series and cross-series dependencies in MTS data to reconstruct masked time-series segments, encouraging the model to focus on relationships between channels. To effectively model cross-series dependencies, we propose the Time-Series-as-Sentence (TSaS), which incorporates cross-series contextual information of MTS segments. Furthermore, JCCMTM introduces a novel embedding transformation paradigm, the Uni-Mul Transformation, to address the embedding alignment issues that arise when applying JCC to MTM. Additionally, two optimization schemes, based on sparse attention and global tokens, respectively, are proposed to reduce JCCMTM's computational complexity. Experimentally, JCCMTM demonstrates outstanding fine-tuning performance compared to the most advanced time series supervised and pre-training methods in two canonical time series analysis tasks: long-term forecasting and anomaly detection. The code for JCCMTM is available at https://github.com/Torea-L/JCCMTM.</p>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"192 \",\"pages\":\"107922\"},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2025-08-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1016/j.neunet.2025.107922\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1016/j.neunet.2025.107922","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract


Multivariate time series (MTS) modeling has become ubiquitous across a wide range of domains. A primary challenge in MTS data modeling is capturing the intricate series dependencies. Mainstream modeling strategies include channel-independent (CI), channel-dependent (CD), and their joint versions. Recently, supervised frameworks based on joint strategies have achieved remarkable success in MTS modeling, but they are typically designed for specific tasks. In contrast, self-supervised pre-training frameworks have shown promise in Masked Time-series Modeling (MTM), benefiting a variety of tasks. However, existing frameworks often overlook the inter-series dependencies across time in MTS, referred to as cross-series dependencies. This paper thus presents JCCMTM, a Joint CI and CD (JCC) strategy-based pre-training framework for MTM. JCCMTM leverages both intra-series and cross-series dependencies in MTS data to reconstruct masked time-series segments, encouraging the model to focus on relationships between channels. To effectively model cross-series dependencies, we propose Time-Series-as-Sentence (TSaS), which incorporates cross-series contextual information of MTS segments. Furthermore, JCCMTM introduces a novel embedding transformation paradigm, the Uni-Mul Transformation, to address the embedding alignment issues that arise when applying JCC to MTM. Additionally, two optimization schemes, based on sparse attention and global tokens respectively, are proposed to reduce JCCMTM's computational complexity. Experimentally, JCCMTM demonstrates outstanding fine-tuning performance compared with state-of-the-art supervised and pre-trained time-series methods on two canonical time-series analysis tasks: long-term forecasting and anomaly detection. The code for JCCMTM is available at https://github.com/Torea-L/JCCMTM.
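To make the joint CI/CD idea concrete, the sketch below shows a minimal masked time-series pre-training step in PyTorch: the series is split into patches, a fraction of patches is masked, a temporal-attention branch models each channel independently (CI), a channel-attention branch mixes information across channels at aligned time positions (CD, loosely in the spirit of TSaS treating co-occurring segments as one "sentence"), and the model reconstructs only the masked patches. This is an illustrative approximation of the ideas in the abstract, not the authors' JCCMTM architecture; every name and hyperparameter here (JointCICDEncoder, patch_len, mask_ratio, and so on) is a hypothetical stand-in.

```python
# Minimal sketch of masked time-series pre-training with a joint
# channel-independent (CI) + channel-dependent (CD) encoder.
# Hypothetical illustration; not the published JCCMTM implementation.
import torch
import torch.nn as nn

class JointCICDEncoder(nn.Module):
    def __init__(self, patch_len=16, d_model=64, n_heads=4):
        super().__init__()
        self.patch_len = patch_len
        self.embed = nn.Linear(patch_len, d_model)       # per-patch embedding
        # CI branch: attention over time, applied to each channel independently
        self.temporal_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        # CD branch: attention across channels at each aligned time position
        self.channel_attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)
        self.head = nn.Linear(d_model, patch_len)        # reconstruct raw patch values
        self.mask_token = nn.Parameter(torch.zeros(d_model))

    def forward(self, x, mask_ratio=0.4):
        # x: (batch, channels, length); length must be divisible by patch_len
        B, C, L = x.shape
        P = L // self.patch_len
        patches = x.reshape(B, C, P, self.patch_len)
        z = self.embed(patches)                          # (B, C, P, d_model)
        # randomly mask whole patches and replace their embeddings
        mask = torch.rand(B, C, P, device=x.device) < mask_ratio
        z = torch.where(mask.unsqueeze(-1), self.mask_token, z)
        # CI: attend over the P time positions within each channel
        zt = z.reshape(B * C, P, -1)
        zt, _ = self.temporal_attn(zt, zt, zt)
        # CD: attend over the C channels at each time position
        zc = zt.reshape(B, C, P, -1).permute(0, 2, 1, 3).reshape(B * P, C, -1)
        zc, _ = self.channel_attn(zc, zc, zc)
        z = zc.reshape(B, P, C, -1).permute(0, 2, 1, 3)  # back to (B, C, P, d_model)
        recon = self.head(z)
        # reconstruction loss on masked patches only (assumes at least one mask)
        loss = ((recon - patches) ** 2).mean(dim=-1)[mask].mean()
        return loss, recon
```

For instance, `loss, _ = JointCICDEncoder()(torch.randn(8, 7, 96))` runs one reconstruction step on a batch of 8 series with 7 channels and 96 time steps (6 patches of length 16 per channel).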
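The abstract also mentions two complexity-reduction schemes, one based on sparse attention and one on global tokens. Below is a hedged sketch of the global-token idea only: instead of full attention over all channel-time tokens, which is quadratic in channels times patches, each channel is mean-pooled into a single summary token and every patch token attends just to those C summaries. The paper's actual scheme may differ; GlobalTokenAttention and the mean-pooling choice are assumptions for illustration.

```python
# Hedged sketch of attention via per-channel global tokens.
# Cost drops from O((C*P)^2) to O(C*P * C); details here are assumed, not JCCMTM's.
import torch
import torch.nn as nn

class GlobalTokenAttention(nn.Module):
    def __init__(self, d_model=64, n_heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(d_model, n_heads, batch_first=True)

    def forward(self, z):
        # z: (batch, channels, patches, d_model)
        B, C, P, D = z.shape
        g = z.mean(dim=2)               # (B, C, D): one pooled global token per channel
        q = z.reshape(B, C * P, D)      # every patch token is a query
        out, _ = self.attn(q, g, g)     # keys/values are only the C global tokens
        return out.reshape(B, C, P, D)
```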

Source journal: Neural Networks (Engineering & Technology / Computer Science: Artificial Intelligence)
CiteScore: 13.90
Self-citation rate: 7.70%
Articles per year: 425
Review time: 67 days
About the journal: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.