Relation-aware Graph Contrastive Learning

Bingshi Li, Jin Li, Yanglan Fu
{"title":"Relation-aware Graph Contrastive Learning","authors":"Bingshi Li, Jin Li, Yanglan Fu","doi":"10.1142/s0129626423400078","DOIUrl":null,"url":null,"abstract":"Over the past few years, graph contrastive learning (GCL) has gained great success in processing unlabeled graph-structured data, but most of the existing GCL methods are based on instance discrimination task which typically learns representations by minimizing the distance between two versions of the same instance. However, different from images, which are assumed to be independently and identically distributed, graphs present relational information among data instances, in which each instance is related to others by links. Furthermore, the relations are heterogeneous in many cases. The instance discrimination task cannot make full use of the relational information inherent in the graph-structured data. To solve the above-mentioned problems, this paper proposes a relation-aware graph contrastive learning method, called RGCL. Aiming to capture the most important heterogeneous relations in the graph, RGCL explicitly models the edges, and then pulls semantically similar pairs of edges together and pushes dissimilar ones apart with contrastive regularization. By exploiting the full potential of the relationship among nodes, RGCL overcomes the limitations of previous GCL methods based on instance discrimination. The experimental results demonstrate that the proposed method outperforms a series of graph contrastive learning frameworks on widely used benchmarks, which justifies the effectiveness of our work.","PeriodicalId":422436,"journal":{"name":"Parallel Process. Lett.","volume":"2006 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Parallel Process. Lett.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/s0129626423400078","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Over the past few years, graph contrastive learning (GCL) has achieved great success in processing unlabeled graph-structured data, but most existing GCL methods are based on the instance discrimination task, which typically learns representations by minimizing the distance between two views of the same instance. However, unlike images, which are assumed to be independent and identically distributed, graphs carry relational information among data instances, in which each instance is related to others by links. Furthermore, these relations are heterogeneous in many cases. The instance discrimination task cannot make full use of the relational information inherent in graph-structured data. To address these problems, this paper proposes a relation-aware graph contrastive learning method, called RGCL. Aiming to capture the most important heterogeneous relations in the graph, RGCL explicitly models the edges, then pulls semantically similar pairs of edges together and pushes dissimilar ones apart with contrastive regularization. By exploiting the full potential of the relationships among nodes, RGCL overcomes the limitations of previous GCL methods based on instance discrimination. Experimental results demonstrate that the proposed method outperforms a series of graph contrastive learning frameworks on widely used benchmarks, which justifies the effectiveness of our work.
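The abstract stays at a high level, so the following is only a rough sketch (in PyTorch, which the paper does not specify) of what the two contrastive signals it mentions could look like: an NT-Xent instance-discrimination loss over two augmented views, and a supervised-contrastive-style term over edge embeddings formed by concatenating endpoint embeddings. The grouping of "semantically similar" edges is abstracted into hypothetical edge_labels, and all names (encoder, lam, edge_labels) and design choices here are assumptions, not RGCL's actual formulation.

```python
# Minimal sketch, assuming PyTorch node embeddings z of shape (N, d) and an
# edge_index tensor of shape (2, E). Not the authors' implementation.
import torch
import torch.nn.functional as F


def nt_xent(z1: torch.Tensor, z2: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Instance discrimination: node i in view 1 is positive with node i in view 2."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    sim = z1 @ z2.t() / tau                                # (N, N) cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)  # positives lie on the diagonal
    return F.cross_entropy(sim, targets)


def edge_contrast(z: torch.Tensor, edge_index: torch.Tensor,
                  edge_labels: torch.Tensor, tau: float = 0.5) -> torch.Tensor:
    """Edge-level contrast: edges sharing a (pseudo-)relation label act as positives."""
    src, dst = edge_index                                       # edge_index: (2, E)
    e = F.normalize(torch.cat([z[src], z[dst]], dim=1), dim=1)  # one embedding per edge
    sim = e @ e.t() / tau                                       # (E, E) edge similarities
    self_mask = torch.eye(sim.size(0), dtype=torch.bool, device=sim.device)
    denom_logits = sim.masked_fill(self_mask, float('-inf'))    # exclude the anchor itself
    log_prob = sim - torch.logsumexp(denom_logits, dim=1, keepdim=True)
    pos = (edge_labels.unsqueeze(0) == edge_labels.unsqueeze(1)).float()
    pos = pos.masked_fill(self_mask, 0.0)                       # an edge is not its own positive
    denom = pos.sum(dim=1).clamp(min=1.0)                       # guard rows with no positive
    return -(pos * log_prob).sum(dim=1).div(denom).mean()


# Example usage (encoder, lam, and edge_labels are hypothetical):
# z1, z2 = encoder(view1), encoder(view2)
# loss = nt_xent(z1, z2) + lam * edge_contrast(z1, edge_index, edge_labels)
```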