Dynamic Graph Representation Based on Temporal and Contextual Contrasting

Wen-Yuan Zhu, Ke Ruan, Jin Huang, Jing Xiao, Weihao Yu
{"title":"Dynamic Graph Representation Based on Temporal and Contextual Contrasting","authors":"Wen-Yuan Zhu, Ke Ruan, Jin Huang, Jing Xiao, Weihao Yu","doi":"10.1145/3579654.3579771","DOIUrl":null,"url":null,"abstract":"Dynamic graph representation learning is critical for graph-based downstream tasks such as link prediction, node classification, and graph reconstruction. Many graph-neural-network-based methods have emerged recently, but most are incapable of tracing graph evolution patterns over time. To solve this problem, we propose a continuous-time dynamic graph framework: dynamic graph temporal contextual contrasting (DGTCC) model, which integrates temporal and topology information to capture the latent evolution trend of graph representation. In this model, the node representation is first generated by a self-attention–based temporal encoder, which measures the importance weights of neighbor nodes in temporal sub-graphs and stores them in the contextual memory module. After sampling the node representation from the memory module, the model maximizes the mutual information of the same node that occurred in two nearby temporal views by the contrastive learning mechanism, which helps track the evolutional trend of nodes. In inductive learning settings, the results on four real datasets demonstrate the advantages of the proposed DGTCC model.","PeriodicalId":146783,"journal":{"name":"Proceedings of the 2022 5th International Conference on Algorithms, Computing and Artificial Intelligence","volume":"185 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2022 5th International Conference on Algorithms, Computing and Artificial Intelligence","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3579654.3579771","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Dynamic graph representation learning is critical for graph-based downstream tasks such as link prediction, node classification, and graph reconstruction. Many graph-neural-network-based methods have emerged recently, but most are incapable of tracing graph evolution patterns over time. To solve this problem, we propose a continuous-time dynamic graph framework, the dynamic graph temporal contextual contrasting (DGTCC) model, which integrates temporal and topological information to capture the latent evolution trend of graph representations. In this model, node representations are first generated by a self-attention-based temporal encoder, which measures the importance weights of neighbor nodes in temporal sub-graphs and stores the resulting representations in a contextual memory module. After sampling node representations from the memory module, the model maximizes the mutual information between representations of the same node in two nearby temporal views via a contrastive learning mechanism, which helps track the evolution trend of nodes. Under inductive learning settings, results on four real-world datasets demonstrate the advantages of the proposed DGTCC model.
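The contrastive mechanism described above, maximizing the mutual information between representations of the same node drawn from two nearby temporal views, is the model's key training signal. As a rough illustration only, the sketch below implements a symmetric InfoNCE-style contrastive loss in PyTorch; the function name, temperature, normalization, and batching are assumptions chosen for clarity and are not the authors' published code.

```python
# Hypothetical sketch of a temporal-view contrastive objective in the spirit
# of DGTCC's abstract. The InfoNCE form, temperature, and projection details
# are illustrative assumptions, not the paper's actual implementation.
import torch
import torch.nn.functional as F

def temporal_contrastive_loss(z_t1: torch.Tensor,
                              z_t2: torch.Tensor,
                              temperature: float = 0.1) -> torch.Tensor:
    """InfoNCE-style loss over a batch of nodes.

    z_t1, z_t2: [num_nodes, dim] embeddings of the SAME nodes taken from two
    nearby temporal views (e.g., read back from a contextual memory module at
    adjacent timestamps). Row i of z_t1 and row i of z_t2 form a positive
    pair; all other rows in the batch serve as negatives.
    """
    z1 = F.normalize(z_t1, dim=1)
    z2 = F.normalize(z_t2, dim=1)
    logits = z1 @ z2.t() / temperature           # [N, N] scaled cosine similarities
    targets = torch.arange(z1.size(0), device=z1.device)
    # Symmetrized cross-entropy: each view predicts its counterpart,
    # pushing the diagonal (same-node pairs) above all other entries.
    return 0.5 * (F.cross_entropy(logits, targets)
                  + F.cross_entropy(logits.t(), targets))

# Usage: embeddings for 128 nodes at two nearby timestamps.
z_a = torch.randn(128, 64, requires_grad=True)
z_b = torch.randn(128, 64, requires_grad=True)
loss = temporal_contrastive_loss(z_a, z_b)
loss.backward()  # in training this would be combined with the task loss
```

In a full training loop, this loss would presumably be computed on node embeddings sampled from the contextual memory module at nearby timestamps and optimized jointly with the downstream objective (e.g., link prediction).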