Ziyi Xiao, Cong Luo, Jiajia Hu, Guodong Sa, Yueyang Wang
"Exploring dual-view graph structures: Contrastive learning with graph and hypergraph for multivariate time series classification"
Neural Networks, vol. 192, Article 107859. DOI: 10.1016/j.neunet.2025.107859. Published 2025-07-15.
Available at: https://www.sciencedirect.com/science/article/pii/S0893608025007397
Citations: 0
Abstract
Multivariate time series (MTS) classification involves not only extracting temporal information but also uncovering the relationships between multiple variables. Graph-based methods have gained attention for their ability to extract temporal information and directly model relationships between variables. However, these methods primarily focus on low-order pairwise relationships between variables, neglecting high-order multivariate non-pairwise relationships, which results in an incomplete capture of inter-variable dependencies. Additionally, the complexity of graph structures can introduce noisy information, making it challenging to distinguish key local aggregation information. To address these challenges, we propose the DVG-CL model, a Dual-View Graph-structured Contrastive Learning framework that models MTS as both a graph and a hypergraph, capturing both low-order pairwise and high-order non-pairwise relationships among variables. We also introduce a cross-view contrasting loss that facilitates the synergistic interaction of variable relationships across different levels, and a local-global mutual information loss, which maximizes both local and global mutual information to filter out noise and identify the most critical local aggregation information. Our experiments on 11 UEA datasets demonstrate that DVG-CL outperforms existing self-supervised learning baselines and validate the effectiveness of its components.
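To make the cross-view idea concrete, the sketch below shows one common way a contrastive objective between two views of the same sample can be set up: embeddings of the same multivariate time series from a graph-view encoder and a hypergraph-view encoder are treated as positives, and other samples in the batch as negatives, under an InfoNCE-style loss. The encoder stubs, dimensions, temperature, and loss form are illustrative assumptions, not the paper's actual DVG-CL implementation.

```python
# Hypothetical sketch of a cross-view contrastive loss between a graph-view
# and a hypergraph-view embedding of the same multivariate time series.
# All names and architectural choices here are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ViewEncoder(nn.Module):
    """Placeholder per-view encoder producing one embedding per sample."""
    def __init__(self, in_dim: int, emb_dim: int):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(in_dim, emb_dim), nn.ReLU(),
                                 nn.Linear(emb_dim, emb_dim))

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.net(x)


def cross_view_contrastive_loss(z_graph: torch.Tensor,
                                z_hyper: torch.Tensor,
                                temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss: the two views of the same sample are positives,
    other samples in the batch act as negatives."""
    z_graph = F.normalize(z_graph, dim=-1)
    z_hyper = F.normalize(z_hyper, dim=-1)
    logits = z_graph @ z_hyper.t() / temperature      # (B, B) cosine similarities
    targets = torch.arange(z_graph.size(0))           # diagonal = positive pairs
    # Symmetrize over the two views.
    return 0.5 * (F.cross_entropy(logits, targets) +
                  F.cross_entropy(logits.t(), targets))


if __name__ == "__main__":
    batch, feat_dim, emb_dim = 8, 64, 32
    x = torch.randn(batch, feat_dim)                   # pooled per-sample MTS features
    graph_enc = ViewEncoder(feat_dim, emb_dim)
    hyper_enc = ViewEncoder(feat_dim, emb_dim)
    loss = cross_view_contrastive_loss(graph_enc(x), hyper_enc(x))
    print(loss.item())
```

In the paper's setting, the two encoders would operate on the graph and hypergraph structures built over the variables rather than on the flat features used in this toy example.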
About the journal:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.