Continuous Graph Learning-Based Self-Adaptation for Multi-Stream Concept Drift

IF 10.5 · CAS Tier 1 (Computer Science) · JCR Q1 (Automation & Control Systems)
Ming Zhou; Jie Lu
{"title":"基于连续图学习的多流概念漂移自适应算法","authors":"Ming Zhou;Jie Lu","doi":"10.1109/TCYB.2025.3569816","DOIUrl":null,"url":null,"abstract":"Concept drift, characterized by changes in data distribution over time, has always been an inevitable problem in nonstationary data stream environments. Multistream scenarios are particularly complex due to the potential alteration of interstream correlations, posing significant challenges in addressing concept drift across multiple streams. Most existing adaptation methods target single-stream data, with limited research on multistream. To address these gaps, we propose a Continuous Graph Learning-based self-adaptation framework for Multistream concept drift, termed as CGLM. Our framework introduces a novel graph neural network (GNN) structure embedded with a dynamic graph generator (AGG). This generator creates an adaptive correlation graph using small-scale historical data, capturing spatio-temporal dependencies among streams without predefined graphs during the training phase. A base prediction GNN model is then initialized. When online testing starts, real-time performance is monitored to detect concept drift. Self-adaptation process is achieved by subgraph updating, with different continuous graph learning mechanisms are applied to nondrift or drift scenarios. Lightweight adjustment of subgraphs is performed under nondrift. When drift occurs, AGG generates a new dynamic graph based on newly arriving samples. Our adaptive diffusion graph attention module (ADGAT) captures local correlation changes caused by the drift in the newly generated dynamic graph. It adaptively updates the weights of the original correlation graph based on the extent of the drift. Experimental results on three large-scale real-world datasets demonstrate the superiority of our method over all baseline methods. Additionally, when large-scale data is available for training, our proposed CGLM still surpasses baseline methods.","PeriodicalId":13112,"journal":{"name":"IEEE Transactions on Cybernetics","volume":"55 8","pages":"3760-3773"},"PeriodicalIF":10.5000,"publicationDate":"2025-06-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Continuous Graph Learning-Based Self-Adaptation for Multi-Stream Concept Drift\",\"authors\":\"Ming Zhou;Jie Lu\",\"doi\":\"10.1109/TCYB.2025.3569816\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Concept drift, characterized by changes in data distribution over time, has always been an inevitable problem in nonstationary data stream environments. Multistream scenarios are particularly complex due to the potential alteration of interstream correlations, posing significant challenges in addressing concept drift across multiple streams. Most existing adaptation methods target single-stream data, with limited research on multistream. To address these gaps, we propose a Continuous Graph Learning-based self-adaptation framework for Multistream concept drift, termed as CGLM. Our framework introduces a novel graph neural network (GNN) structure embedded with a dynamic graph generator (AGG). This generator creates an adaptive correlation graph using small-scale historical data, capturing spatio-temporal dependencies among streams without predefined graphs during the training phase. A base prediction GNN model is then initialized. When online testing starts, real-time performance is monitored to detect concept drift. 
Self-adaptation process is achieved by subgraph updating, with different continuous graph learning mechanisms are applied to nondrift or drift scenarios. Lightweight adjustment of subgraphs is performed under nondrift. When drift occurs, AGG generates a new dynamic graph based on newly arriving samples. Our adaptive diffusion graph attention module (ADGAT) captures local correlation changes caused by the drift in the newly generated dynamic graph. It adaptively updates the weights of the original correlation graph based on the extent of the drift. Experimental results on three large-scale real-world datasets demonstrate the superiority of our method over all baseline methods. Additionally, when large-scale data is available for training, our proposed CGLM still surpasses baseline methods.\",\"PeriodicalId\":13112,\"journal\":{\"name\":\"IEEE Transactions on Cybernetics\",\"volume\":\"55 8\",\"pages\":\"3760-3773\"},\"PeriodicalIF\":10.5000,\"publicationDate\":\"2025-06-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Cybernetics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11028064/\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Cybernetics","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/11028064/","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Concept drift, characterized by changes in data distribution over time, is an inevitable problem in nonstationary data stream environments. Multistream scenarios are particularly complex because interstream correlations may themselves change, posing significant challenges for addressing concept drift across multiple streams. Most existing adaptation methods target single-stream data, with limited research on multistream settings. To address these gaps, we propose a Continuous Graph Learning-based self-adaptation framework for Multistream concept drift, termed CGLM. Our framework introduces a novel graph neural network (GNN) structure embedded with a dynamic graph generator (AGG). This generator creates an adaptive correlation graph from small-scale historical data, capturing spatio-temporal dependencies among streams without requiring predefined graphs during the training phase. A base prediction GNN model is then initialized. Once online testing starts, real-time performance is monitored to detect concept drift. The self-adaptation process is achieved through subgraph updating, with different continuous graph learning mechanisms applied to nondrift and drift scenarios. Under nondrift conditions, a lightweight adjustment of the subgraphs is performed. When drift occurs, the AGG generates a new dynamic graph from the newly arriving samples. Our adaptive diffusion graph attention module (ADGAT) captures the local correlation changes caused by the drift in the newly generated dynamic graph and adaptively updates the weights of the original correlation graph according to the extent of the drift. Experimental results on three large-scale real-world datasets demonstrate the superiority of our method over all baseline methods. Additionally, when large-scale data is available for training, the proposed CGLM still surpasses the baseline methods.
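The abstract describes CGLM's online workflow only in procedural terms, and this page reproduces none of the authors' code. The sketch below is therefore just a minimal Python illustration of that control flow under loose assumptions: a plain correlation matrix stands in for the AGG-generated adaptive graph, mean squared prediction error stands in for the real-time performance monitor, and a severity-weighted blend of the old and new graphs stands in for the ADGAT re-weighting step. All names, thresholds, and window sizes are hypothetical; this is not the paper's implementation.

```python
# Minimal sketch of the nondrift/drift adaptation loop described in the
# abstract. Not the authors' code: the graph generator, drift detector,
# and re-weighting step are deliberately simplified stand-ins.
import numpy as np


def correlation_graph(window: np.ndarray) -> np.ndarray:
    """Adjacency from pairwise stream correlations (stand-in for the AGG)."""
    adj = np.abs(np.corrcoef(window.T))
    np.fill_diagonal(adj, 0.0)
    return adj


def online_adaptation(streams: np.ndarray, hist_len: int = 50,
                      drift_threshold: float = 2.0) -> np.ndarray:
    """streams: (time, n_streams). Returns the final correlation graph."""
    history = streams[:hist_len]
    graph = correlation_graph(history)           # initial adaptive graph
    errors = []                                  # running performance monitor
    for t in range(hist_len, len(streams) - 1):
        # naive one-step prediction: graph-weighted average of the streams
        x_t = streams[t]
        pred = graph @ x_t / (graph.sum(axis=1) + 1e-8)
        errors.append(np.mean((pred - streams[t + 1]) ** 2))
        if len(errors) < 10:
            continue
        recent, baseline = np.mean(errors[-5:]), np.mean(errors[:-5])
        if recent > drift_threshold * baseline:
            # drift detected: regenerate the graph from the newest samples
            new_graph = correlation_graph(streams[t - hist_len + 1:t + 1])
            extent = min(1.0, recent / (baseline + 1e-8) - 1.0)
            # blend old and new graphs in proportion to the drift extent
            # (stands in for the ADGAT re-weighting of local correlations)
            graph = (1 - extent) * graph + extent * new_graph
            errors = errors[-5:]                 # reset the monitor window
        else:
            # nondrift: lightweight adjustment, a small decay toward the
            # correlations of the most recent window
            graph = 0.95 * graph + 0.05 * correlation_graph(
                streams[t - hist_len + 1:t + 1])
    return graph


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    data = rng.normal(size=(500, 4)).cumsum(axis=0)   # synthetic multistream
    print(online_adaptation(data).round(2))
```

The branch structure is the point of the sketch: while no drift is signaled, only a cheap incremental adjustment of the graph is made; when the monitor fires, a new graph is generated from the most recent samples and blended in according to the estimated extent of the drift, mirroring the nondrift/drift split described in the abstract.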
Source journal
IEEE Transactions on Cybernetics
Categories: Computer Science, Artificial Intelligence; Computer Science, Cybernetics
CiteScore: 25.40
Self-citation rate: 11.00%
Annual article count: 1869
Journal description: The scope of the IEEE Transactions on Cybernetics includes computational approaches to the field of cybernetics. Specifically, the Transactions welcomes papers on communication and control across machines, or across machines, humans, and organizations. The scope includes such areas as computational intelligence, computer vision, neural networks, genetic algorithms, machine learning, fuzzy systems, cognitive systems, decision making, and robotics, to the extent that they contribute to the theme of cybernetics or demonstrate an application of cybernetics principles.