DCMSL: Dual influenced community strength-boosted multi-scale graph contrastive learning

IF 7.2 · CAS Tier 1 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
DOI: 10.1016/j.knosys.2024.112472
Journal: Knowledge-Based Systems (Journal Article)
Publication date: 2024-09-12
Full text: https://www.sciencedirect.com/science/article/pii/S0950705124011067
Citations: 0

Abstract

Graph Contrastive Learning (GCL) effectively mitigates label dependency by defining positive and negative pairs for node embeddings. Nevertheless, most GCL methods, including those that consider communities, overlook the simultaneous influence of community and node, a crucial factor for accurate embeddings. In this paper, we propose Dual influenced Community Strength-boosted Multi-Scale Graph Contrastive Learning (DCMSL), which concurrently considers community and node influence for comprehensive contrastive learning. First, we define dual influenced community strength, which adapts to diverse datasets; based on it, we define node cruciality to differentiate node importance. Second, two graph data augmentation methods, NCNAM and NCED, are proposed based on node cruciality, guiding augmentation to preserve more influential semantic information. Third, a joint multi-scale graph contrastive scheme is proposed to guide the graph encoder to learn semantic information at two scales: (1) propulsive-force node-level graph contrastive learning, a node-level contrastive loss that defines the force pushing negative pairs in GCL farther apart; and (2) community-level graph contrastive learning, which enables the graph encoder to learn from data at the community level, improving model performance. DCMSL achieves state-of-the-art results, demonstrating its effectiveness and versatility on two node-level tasks (node classification and node clustering) and one edge-level task (link prediction). Our code is available at: https://github.com/HanChen-HUST/DCMSL.
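The cruciality-guided augmentation described above can be illustrated with a minimal sketch. This is not the paper's implementation (see the linked repository for that); it assumes, for illustration only, that node influence is approximated by degree centrality, that community strength is a given per-community weight, and that an NCED-like edge drop removes edges between low-cruciality nodes with higher probability. The function names `node_cruciality` and `crucial_edge_drop` are hypothetical.

```python
import math
import random

def degree_centrality(num_nodes, edges):
    """Node degree as a simple stand-in for node-level influence."""
    deg = [0] * num_nodes
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    return deg

def node_cruciality(num_nodes, edges, community, community_strength):
    """Hypothetical cruciality score: fuse a node's own centrality with
    the strength of its community (the two factors DCMSL combines)."""
    deg = degree_centrality(num_nodes, edges)
    return [math.log1p(deg[i]) * community_strength[community[i]]
            for i in range(num_nodes)]

def crucial_edge_drop(edges, cruciality, p_max=0.7, seed=0):
    """NCED-style augmentation sketch: drop edges between low-cruciality
    nodes with higher probability, preserving influential structure."""
    rng = random.Random(seed)
    # Score each edge by the mean cruciality of its endpoints.
    scores = [(cruciality[u] + cruciality[v]) / 2 for u, v in edges]
    lo, hi = min(scores), max(scores)
    span = (hi - lo) or 1.0
    kept = []
    for (u, v), s in zip(edges, scores):
        # Normalised score in [0, 1]; a high score means a low drop chance.
        p_drop = p_max * (1.0 - (s - lo) / span)
        if rng.random() >= p_drop:
            kept.append((u, v))
    return kept

# Toy usage: 5 nodes, two communities of different strength.
edges = [(0, 1), (1, 2), (2, 0), (2, 3), (3, 4)]
cru = node_cruciality(5, edges, [0, 0, 0, 1, 1], [1.0, 0.5])
view = crucial_edge_drop(edges, cru)  # one augmented view for GCL
```

Two such views, generated with different seeds, would then feed the contrastive encoder; the intent is that the retained subgraphs still contain the semantically influential edges.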


Source journal
Knowledge-Based Systems (Engineering & Technology – Computer Science: Artificial Intelligence)
CiteScore: 14.80
Self-citation rate: 12.50%
Articles per year: 1245
Review time: 7.8 months
About the journal: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems based on knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, provide balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.