Continuous Graph Learning-Based Self-Adaptation for Multi-Stream Concept Drift

Authors: Ming Zhou; Jie Lu
DOI: 10.1109/TCYB.2025.3569816
Journal: IEEE Transactions on Cybernetics, vol. 55, no. 8, pp. 3760-3773 (JCR Q1, Automation & Control Systems; impact factor 10.5)
Published: 2025-06-09
URL: https://ieeexplore.ieee.org/document/11028064/

Abstract: Concept drift, characterized by changes in data distribution over time, is an inevitable problem in nonstationary data stream environments. Multistream scenarios are particularly complex because the correlations between streams may themselves change, which poses significant challenges for addressing concept drift across multiple streams. Most existing adaptation methods target single-stream data, and research on multistream settings remains limited. To address these gaps, we propose a Continuous Graph Learning-based self-adaptation framework for Multistream concept drift, termed CGLM. The framework introduces a novel graph neural network (GNN) structure embedded with a dynamic graph generator (AGG). This generator creates an adaptive correlation graph from small-scale historical data, capturing spatio-temporal dependencies among streams without a predefined graph during the training phase. A base prediction GNN model is then initialized. Once online testing starts, real-time performance is monitored to detect concept drift. Self-adaptation is achieved through subgraph updating, with different continuous graph learning mechanisms applied to the non-drift and drift scenarios: under non-drift, subgraphs receive only lightweight adjustment; when drift occurs, AGG generates a new dynamic graph from the newly arriving samples. Our adaptive diffusion graph attention module (ADGAT) captures the local correlation changes caused by the drift in the newly generated dynamic graph and adaptively updates the weights of the original correlation graph according to the extent of the drift. Experimental results on three large-scale real-world datasets demonstrate the superiority of our method over all baseline methods. Additionally, when large-scale data is available for training, CGLM still surpasses the baseline methods.
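The abstract states that AGG learns a correlation graph from data rather than relying on a predefined adjacency, but the paper's internals are not reproduced here. A common way to realize such an adaptive graph (in the style of Graph WaveNet's adaptive adjacency) is to hold two trainable node-embedding matrices and derive a row-normalized correlation matrix from their product; the sketch below assumes this construction, and all sizes and names (`n_streams`, `E1`, `E2`) are illustrative, not taken from the paper.

```python
import numpy as np

# Hypothetical sketch of an adaptive correlation graph over data streams.
# Two embedding matrices (trainable in a real model) are multiplied, passed
# through ReLU to keep only positive affinities, and row-softmaxed so each
# stream's outgoing correlation weights sum to 1.
rng = np.random.default_rng(0)
n_streams, emb_dim = 5, 8                        # illustrative sizes

E1 = rng.standard_normal((n_streams, emb_dim))   # source-side embeddings
E2 = rng.standard_normal((n_streams, emb_dim))   # target-side embeddings

def softmax(x, axis=-1):
    z = x - x.max(axis=axis, keepdims=True)      # stabilize before exp
    e = np.exp(z)
    return e / e.sum(axis=axis, keepdims=True)

# Adaptive adjacency: non-negative scores, each row normalized to sum to 1.
A = softmax(np.maximum(E1 @ E2.T, 0.0), axis=1)
```

In a full model, `E1` and `E2` would be updated by gradient descent alongside the GNN weights, so the graph structure itself adapts to the training data.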
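The abstract describes two online mechanisms: monitoring real-time performance to detect drift, and updating the original correlation graph in proportion to the drift's extent. The paper's exact detector and update rule are not given here; the sketch below assumes a simple sliding-window error-rate detector and a convex blend of old and new adjacency matrices whose mixing weight grows with drift magnitude. The function names, `window`, `threshold`, and `scale` parameters are all illustrative assumptions.

```python
import numpy as np

def detect_drift(errors, window=50, threshold=0.15):
    """Return (drift_flag, magnitude) from a 0/1 prediction-error sequence.

    Illustrative rule: flag drift when the recent windowed error rate
    exceeds the historical error rate by more than `threshold`.
    """
    errors = np.asarray(errors, dtype=float)
    if errors.size < 2 * window:          # not enough history to compare
        return False, 0.0
    recent = errors[-window:].mean()
    history = errors[:-window].mean()
    magnitude = max(recent - history, 0.0)
    return bool(magnitude > threshold), float(magnitude)

def update_graph(A_old, A_new, magnitude, scale=2.0):
    """Convex blend of old/new adjacency; weight on the new graph grows
    with drift magnitude, echoing a drift-proportional reweighting."""
    alpha = min(scale * magnitude, 1.0)
    return (1.0 - alpha) * A_old + alpha * A_new

# Usage: a stable phase followed by a burst of errors triggers drift,
# and the correlation graph shifts toward the newly generated one.
errs = [0] * 100 + [1] * 50
drift, mag = detect_drift(errs)
A_old = np.eye(3)
A_new = np.full((3, 3), 1.0 / 3.0)
A = update_graph(A_old, A_new, mag)
```

Under no drift, `magnitude` stays near zero and the blend leaves the old graph almost untouched, which mirrors the lightweight-adjustment regime described in the abstract.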
Journal overview:
The scope of the IEEE Transactions on Cybernetics includes computational approaches to the field of cybernetics. Specifically, the transactions welcomes papers on communication and control across machines, or between machines, humans, and organizations. The scope includes such areas as computational intelligence, computer vision, neural networks, genetic algorithms, machine learning, fuzzy systems, cognitive systems, decision making, and robotics, to the extent that they contribute to the theme of cybernetics or demonstrate an application of cybernetics principles.