SGB-Net: Scalable Graph Broad Network
Authors: Yuebin Xu; C. L. Philip Chen; Mengqi Wu; Tong Zhang
Journal: IEEE Transactions on Neural Networks and Learning Systems, vol. 36, no. 9, pp. 17019-17033
DOI: 10.1109/TNNLS.2025.3552129
Publication date: 2025-04-10
URL: https://ieeexplore.ieee.org/document/10962448/
Citation count: 0
Abstract
Because real-world graph data are complex and self-evolving, graph learning methods must both represent unstructured data effectively and scale to evolving graphs. However, current methods are limited in representation learning over an optimizable graph feature space by the bottleneck of structural depth. Moreover, they require complete retraining when graphs evolve, especially when no new labels are available. To address these issues, we propose the scalable graph broad network (SGB-Net), which comprises three modules: a graph feature broad transformation (GFBT) layer that enhances graph embedding, and two update algorithms (SGB-Net-U and SGB-Net-S) that provide scalability. The GFBT layer explicitly expands the graph feature space and builds the model broadly, constructing two expandable feature spaces at different graph scales to embed graphs discriminatively. SGB-Net-U is an exploratory method that tackles the label-free graph incremental learning (GIL) problem by leveraging unsupervised incremental knowledge to expand the graph representation. SGB-Net-S provides scalability in classical incremental learning scenarios in which labels are available. Owing to its broad construction framework, SGB-Net not only enhances graph embeddings but also adapts seamlessly to graph expansion, improving performance without retraining. In experiments on 15 benchmark datasets, SGB-Net outperforms state-of-the-art GNNs in both effectiveness and scalability.
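The abstract's central idea, building a model "broadly" (widening random feature groups and re-solving only a linear readout) rather than deepening it, follows the broad learning system paradigm that SGB-Net extends to graphs. The sketch below is a minimal, generic illustration of that paradigm only: all names, layer widths, and the ridge-regression readout are assumptions for illustration, not the paper's GFBT layer or its SGB-Net-U/SGB-Net-S algorithms.

```python
import numpy as np

rng = np.random.default_rng(0)

def ridge_readout(A, Y, lam=1e-2):
    # Closed-form ridge solution: W = (A^T A + lam*I)^{-1} A^T Y
    return np.linalg.solve(A.T @ A + lam * np.eye(A.shape[1]), A.T @ Y)

# Toy graph-level inputs: n graphs, each summarized as a d-dim feature vector.
n, d, n_classes = 200, 16, 3
X = rng.standard_normal((n, d))
Y = np.eye(n_classes)[rng.integers(0, n_classes, n)]  # one-hot labels

def feature_group(X, width):
    # One randomly mapped feature group (random projection + nonlinearity).
    W = rng.standard_normal((X.shape[1], width))
    return np.tanh(X @ W)

# Broad construction: feature groups and enhancement nodes are concatenated
# side by side instead of stacked in depth.
Z = np.hstack([feature_group(X, 8) for _ in range(4)])   # mapped features (width 32)
H = np.tanh(Z @ rng.standard_normal((Z.shape[1], 24)))   # enhancement nodes
A = np.hstack([Z, H])                                    # (200, 56)
W_out = ridge_readout(A, Y)

# Scalability without retraining: when the model must grow (e.g., as graphs
# evolve), widen it with an extra enhancement group and re-solve only the
# linear readout -- the existing feature layers are untouched.
H_new = np.tanh(Z @ rng.standard_normal((Z.shape[1], 8)))
A_wide = np.hstack([A, H_new])                           # (200, 64)
W_wide = ridge_readout(A_wide, Y)
```

Here the full readout is re-solved for simplicity; broad learning systems typically update the pseudoinverse incrementally when new nodes are appended, which is what makes expansion cheaper than retraining from scratch.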
Journal description:
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.