Edgeless-GNN: Unsupervised Representation Learning for Edgeless Nodes

IF 5.1 · CAS Tier 2 (Computer Science) · Q1 COMPUTER SCIENCE, INFORMATION SYSTEMS
Yong-Min Shin; Cong Tran; Won-Yong Shin; Xin Cao

DOI: 10.1109/TETC.2023.3292240
Journal: IEEE Transactions on Emerging Topics in Computing, vol. 12, no. 1, pp. 150-162
Published: 2023-07-11
URL: https://ieeexplore.ieee.org/document/10179257/
Citations: 0

Abstract

We study the problem of embedding edgeless nodes, such as users who newly enter the underlying network, using graph neural networks (GNNs), which are widely studied for effective representation learning on graphs. Our study is motivated by the fact that GNNs cannot be straightforwardly adopted for this problem, since message passing to edgeless nodes having no connections is impossible. To tackle this challenge, we propose $\mathsf{Edgeless-GNN}$, a novel inductive framework that enables GNNs to generate node embeddings even for edgeless nodes through unsupervised learning. Specifically, we start by constructing a proxy graph based on the similarity of node attributes, used as the GNN's computation graph in place of the one defined by the underlying network. The known network structure is used to train model parameters, whereas a topology-aware loss function is established such that our model judiciously learns the network structure by encoding positive, negative, and second-order relations between nodes. For the edgeless nodes, we inductively infer embeddings by expanding the computation graph. By evaluating the performance of various downstream machine learning tasks, we empirically demonstrate that $\mathsf{Edgeless-GNN}$ exhibits (a) superiority over state-of-the-art inductive network embedding methods for edgeless nodes, (b) the effectiveness of our topology-aware loss function, (c) robustness to incomplete node attributes, and (d) linear scaling with the graph size.
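The abstract outlines two mechanical pieces: an attribute-similarity proxy graph that serves as the GNN's computation graph, and inductive inference for edgeless nodes by expanding that graph. The sketch below illustrates one plausible instantiation with a k-nearest-neighbor proxy graph and simple mean aggregation; the function names, the choice of cosine similarity, and the aggregation rule are assumptions for illustration only, not the authors' implementation or their topology-aware loss.

```python
# Minimal sketch (assumptions, not the paper's code): build a k-NN proxy graph
# from node attributes and embed an edgeless node by mean-aggregating the
# embeddings of its attribute-nearest training nodes.
import numpy as np

def knn_proxy_graph(X, k=5):
    """Return {node: k most attribute-similar neighbors} from cosine similarity
    of node attributes X (n x d). Stands in for the attribute-based proxy graph
    that replaces the structural computation graph for edgeless nodes."""
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    sim = Xn @ Xn.T
    np.fill_diagonal(sim, -np.inf)          # exclude self-loops
    return {i: np.argsort(-sim[i])[:k] for i in range(X.shape[0])}

def embed_edgeless_node(x_new, X, Z, k=5):
    """Inductive step: attach the edgeless node to its k most attribute-similar
    training nodes and average their embeddings Z (n x h), a first-order
    approximation of message passing on the expanded computation graph."""
    xn = x_new / (np.linalg.norm(x_new) + 1e-12)
    Xn = X / (np.linalg.norm(X, axis=1, keepdims=True) + 1e-12)
    nbrs = np.argsort(-(Xn @ xn))[:k]
    return Z[nbrs].mean(axis=0)

# Toy usage: 100 training nodes, 16-dim attributes, 8-dim (placeholder) embeddings.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 16))
Z = rng.normal(size=(100, 8))               # would come from the trained GNN
proxy = knn_proxy_graph(X, k=5)
z_new = embed_edgeless_node(rng.normal(size=16), X, Z, k=5)
```

In the actual framework, the embeddings Z would be produced by a GNN trained with the topology-aware loss on the known network structure; the mean aggregation here merely shows how an edgeless node can receive messages once the proxy graph supplies it with neighbors.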
Source Journal
IEEE Transactions on Emerging Topics in Computing
Category: Computer Science (miscellaneous)
CiteScore: 12.10
Self-citation rate: 5.10%
Articles per year: 113
Journal description: IEEE Transactions on Emerging Topics in Computing publishes papers on emerging aspects of computer science, computing technology, and computing applications not currently covered by other IEEE Computer Society Transactions. Some examples of emerging topics in computing include: IT for Green, synthetic and organic computing structures and systems, advanced analytics, social/occupational computing, location-based/client computer systems, morphic computer design, electronic game systems, and health-care IT.