SGB-Net: Scalable Graph Broad Network

Impact Factor 8.9 · JCR Q1 (Region 1, Computer Science) · COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Yuebin Xu, C. L. Philip Chen, Mengqi Wu, Tong Zhang
DOI: 10.1109/TNNLS.2025.3552129
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 9, pp. 17019-17033
Published: 2025-04-10 (Journal Article)
Citations: 0

Abstract

Due to the complexity and self-evolutionary property of graph data in reality, graph learning methods require both validity to represent unstructured data and scalability to adapt to evolving graphs. However, current works have representation learning limitations on optimizable graph feature space due to the bottleneck of the structure depth. Moreover, they encounter a complete retraining process when graphs evolve, especially in the case without the assistance of new labels. To address the above issues, we propose a scalable graph broad network (SGB-Net), which contains three proposed modules: the graph feature broad transformation layer (GFBT layer) for enhancing graph embedding and two update algorithms (SGB-Net-U, SGB-Net-S) for endowing scalability. The GFBT layer aims to explicitly expand the graph feature space and broadly build the model. It constructs two expandable feature spaces in various graph scales to embed graphs discriminatively. SGB-Net-U is an exploratory method designed to tackle the label-free graph incremental learning (GIL) problem by leveraging unsupervised incremental knowledge to expand graph representation. SGB-Net-S endows scalability in classical incremental learning scenarios involving labels. Benefiting from its broad construction framework, SGB-Net not only enhances graph embeddings but also seamlessly adapts and improves performance in response to graph expansion without requiring retraining. In the experiments conducted on 15 benchmark datasets, SGB-Net outperforms state-of-the-art GNNs in terms of both effectiveness and scalability.
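The abstract's "broad construction" idea (widen an explicit feature space, fit a closed-form readout, then absorb new feature blocks instead of retraining from scratch) follows the general broad-learning pattern. The paper's actual GFBT layer and the SGB-Net-U/SGB-Net-S update rules are not specified here, so the following is only a minimal illustrative sketch of that generic pattern on toy data; all function names and parameters are hypothetical, and the "expansion" step simply refits the ridge readout on the widened feature matrix rather than performing a true incremental pseudoinverse update.

```python
# Illustrative sketch of the generic broad-learning pattern, NOT the paper's
# SGB-Net algorithm: random nonlinear feature expansion + ridge readout,
# then widening the feature space with an extra block.
import numpy as np

rng = np.random.default_rng(0)

def broad_features(X, n_nodes):
    """One 'broad' block: random linear map followed by tanh."""
    d = X.shape[1]
    W = rng.standard_normal((d, n_nodes))
    b = rng.standard_normal(n_nodes)
    return np.tanh(X @ W + b)

def ridge_readout(A, Y, lam=1e-2):
    """Closed-form readout weights: (A^T A + lam*I)^{-1} A^T Y."""
    k = A.shape[1]
    return np.linalg.solve(A.T @ A + lam * np.eye(k), A.T @ Y)

# Toy data: 200 samples, 8 features, 3 classes (one-hot labels).
X = rng.standard_normal((200, 8))
Y = np.eye(3)[rng.integers(0, 3, 200)]

A1 = broad_features(X, 32)            # initial broad block
W_out = ridge_readout(A1, Y)

# "Expand without full retraining": append a new feature block and refit only
# the cheap closed-form readout on the widened matrix. (A full broad-learning
# system would update the pseudoinverse incrementally instead.)
A2 = broad_features(X, 16)
A = np.hstack([A1, A2])
W_out = ridge_readout(A, Y)
acc = (np.argmax(A @ W_out, 1) == np.argmax(Y, 1)).mean()
```

The point of the sketch is structural: widening the model adds columns to the feature matrix, so adapting to growth costs one linear solve rather than end-to-end retraining, which is the scalability property the abstract claims for SGB-Net.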
Source journal
IEEE Transactions on Neural Networks and Learning Systems
Categories: Computer Science, Artificial Intelligence; Computer Science, Hardware & Architecture
CiteScore: 23.80
Self-citation rate: 9.60%
Articles published: 2102
Review time: 3-8 weeks
Journal description: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.