Improving the representational power of graph neural networks via mixed substructure learning

Impact Factor 15.5 · CAS Zone 1 (Computer Science) · JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Zhenpeng Wu , Jiamin Chen , Jianliang Gao
Journal: Information Fusion, Vol. 126, Article 103558
DOI: 10.1016/j.inffus.2025.103558
Published: 2025-07-26 (Journal Article)
Citations: 0

Abstract

The recent trend in graph representation learning is to use Graph Neural Networks (GNNs) to approximate specific functions to capture specific graph substructures when performing aggregation, achieving stronger representational power than the 1-dimensional Weisfeiler-Leman (1-WL) graph isomorphism test. However, different graph substructures have different contributions in various scenarios, such as the clique substructure for social networks. Moreover, these methods suffer from high computational costs in capturing graph substructures, making it impractical to directly count all graph substructures when performing aggregation. Therefore, adapting the optimal graph substructure for different scenarios is an obvious challenge. To address the above challenge, we propose a simple yet effective solution, MixSL, which is flexible enough to work with any GNN backbone. Based on theoretical analysis, we offer a straightforward strategy that restricts the information of all graph substructures to the input feature space in advance, rather than the aggregation process, thereby significantly reducing computational costs. Then, we apply mixed substructure learning to all graph substructures, so that the GNN backbone can automatically learn the sample distribution of graph substructures. Without changing the GNN backbone architecture and training settings, MixSL brings a consistent and significant performance improvement on multiple graph classification benchmarks from different scenarios.
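The paper itself is not reproduced here, but the core strategy the abstract describes — restricting substructure information to the input feature space in advance, rather than the aggregation process — can be sketched in plain Python. This is an illustrative assumption, not MixSL's actual algorithm: triangle (3-clique) counts stand in for generic substructure statistics, and a hypothetical `augment_features` helper simply appends them to each node's raw features before any GNN backbone sees the graph, so aggregation itself stays substructure-free.

```python
# Hedged sketch of precomputing substructure statistics as input features.
# Triangle counts are one possible substructure signal; the paper's actual
# choice and mixing scheme are not specified in this abstract.
from itertools import combinations

def triangle_counts(adj):
    """Count the triangles each node participates in.

    adj: dict mapping node -> set of neighbor nodes (undirected graph).
    """
    counts = {v: 0 for v in adj}
    for v, nbrs in adj.items():
        # Each neighbor pair (u, w) that is itself connected closes a triangle.
        for u, w in combinations(sorted(nbrs), 2):
            if w in adj[u]:
                counts[v] += 1
    return counts

def augment_features(features, adj):
    """Append per-node triangle counts to the raw feature vectors.

    Done once, before training, so the GNN's aggregation step never has
    to count substructures itself.
    """
    tri = triangle_counts(adj)
    return {v: feats + [float(tri[v])] for v, feats in features.items()}

# Toy graph: a triangle {0, 1, 2} plus a pendant node 3.
adj = {0: {1, 2}, 1: {0, 2}, 2: {0, 1, 3}, 3: {2}}
features = {v: [1.0] for v in adj}  # trivial one-dimensional raw features
aug = augment_features(features, adj)
```

After augmentation, nodes 0–2 carry a triangle count of 1 and the pendant node 3 carries 0, so a standard message-passing GNN can distinguish them from their input features alone — the general effect the abstract attributes to moving substructure information upstream of aggregation.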
Source journal: Information Fusion (Engineering & Technology — Computer Science: Theory & Methods)
CiteScore: 33.20
Self-citation rate: 4.30%
Annual articles: 161
Review time: 7.9 months
Journal description: Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers dealing with fundamental theoretical analyses, as well as those demonstrating their application to real-world problems, are welcome.