Position encoding for heterogeneous graph neural networks

Xi Zeng, Qingyun Dai, Fangyu Lei
{"title":"Position encoding for heterogeneous graph neural networks","authors":"Xi Zeng, Qingyun Dai, Fangyu Lei","doi":"10.1117/12.2639209","DOIUrl":null,"url":null,"abstract":"Many real-world networks are suitable to be modeled as heterogeneous graphs, which are made up of many sorts of nodes and links. When the heterogeneous map is a non-attribute graph or some features on the graph are missing, it will lead to poor performance of the previous models. In this paper, we hold that useful position features can be generated through the guidance of topological information on the graph and present a generic framework for Heterogeneous Graph Neural Networks(HGNNs), termed Position Encoding(PE). First of all, PE leverages existing node embedding methods to obtain the implicit semantics on a graph and generate low-dimensional node embedding. Secondly, for each task-related target node, PE generates corresponding sampling subgraphs, in which we use node embedding to calculate the relative positions and encode the positions into position features that can be used directly or as an additional feature. Then the set of subgraphs with position features can be easily combined with the desired Graph Neural Networks (GNNs) or HGNNs to learn the representation of target nodes. We evaluated our method on graph classification tasks over three commonly used heterogeneous graph datasets with two processing ways, and experimental results show the superiority of PE over baselines.","PeriodicalId":336892,"journal":{"name":"Neural Networks, Information and Communication Engineering","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks, Information and Communication Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2639209","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Many real-world networks are naturally modeled as heterogeneous graphs, which consist of multiple types of nodes and links. When a heterogeneous graph has no node attributes, or some features on the graph are missing, previous models perform poorly. In this paper, we argue that useful position features can be generated under the guidance of the graph's topological information, and we present a generic framework for Heterogeneous Graph Neural Networks (HGNNs), termed Position Encoding (PE). First, PE leverages existing node embedding methods to capture the implicit semantics of a graph and to generate low-dimensional node embeddings. Second, for each task-related target node, PE generates a corresponding sampled subgraph, in which the node embeddings are used to compute relative positions; these positions are then encoded into position features that can be used directly or as additional features. The set of subgraphs with position features can then be easily combined with any desired Graph Neural Network (GNN) or HGNN to learn representations of the target nodes. We evaluated our method on graph classification tasks over three commonly used heterogeneous graph datasets under two processing schemes, and the experimental results show the superiority of PE over the baselines.
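To make the three-step pipeline sketched in the abstract concrete (embed nodes, sample a subgraph per target node, encode positions relative to the target), the following is a minimal Python sketch built on numpy and networkx. The function names (embed_nodes, sample_subgraph, position_features), the random-projection embedding, and the k-hop ego-graph sampling are illustrative assumptions, not the authors' implementation; in the paper, the embedding step would be an existing node embedding method for heterogeneous graphs.

import numpy as np
import networkx as nx

def embed_nodes(graph, dim=16, seed=0):
    """Placeholder for an existing node-embedding method.
    A single propagation of a random projection gives each node a rough
    low-dimensional structural signature; it only stands in for whatever
    embedding method PE actually plugs in."""
    rng = np.random.default_rng(seed)
    nodes = list(graph.nodes())
    adj = nx.to_numpy_array(graph, nodelist=nodes)
    proj = rng.standard_normal((len(nodes), dim))
    emb = adj @ proj
    return {n: emb[i] for i, n in enumerate(nodes)}

def sample_subgraph(graph, target, hops=2):
    """Sample a subgraph around a task-related target node (here: k-hop ego graph)."""
    return nx.ego_graph(graph, target, radius=hops)

def position_features(subgraph, target, embedding):
    """Encode each subgraph node's position relative to the target node."""
    anchor = embedding[target]
    return {n: embedding[n] - anchor for n in subgraph.nodes()}

# Toy heterogeneous graph: author (a*) and paper (p*) nodes.
G = nx.Graph()
G.add_edges_from([("a1", "p1"), ("a2", "p1"), ("a2", "p2"), ("a3", "p2")])

emb = embed_nodes(G)
sub = sample_subgraph(G, "p1", hops=1)
pos = position_features(sub, "p1", emb)

# `pos` can be used directly as node features, or concatenated with any
# existing attributes, before feeding the subgraph to a GNN/HGNN.
print({n: v[:3].round(2) for n, v in pos.items()})

In this sketch, the relative-position feature is simply the embedding offset from the target node; the paper's actual encoding of relative positions may differ, but the data flow (embedding → per-target subgraph → position features → downstream GNN/HGNN) matches the steps described above.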