Degree Matters: Assessing the Generalization of Graph Neural Network

Hyunmok Park, Kijung Yoon
{"title":"程度问题:评估图神经网络的泛化","authors":"Hyunmok Park, Kijung Yoon","doi":"10.1109/IC-NIDC54101.2021.9660574","DOIUrl":null,"url":null,"abstract":"Graph neural network (GNN) is a general framework for using deep neural networks on graph data. The defining feature of a GNN is that it uses a form of neural message passing where vector messages are exchanged between nodes and updated using neural networks. The message passing operation that underlies GNNs has recently been applied to develop neural approximate inference algorithms, but little work has been done on understanding under what conditions GNNs can be used as a core module for building general inference models. To study this question, we consider the task of out-of-distribution generalization where training and test data have different distributions, by systematically investigating how the graph size and structural properties affect the inferential performance of GNNs. We find that (1) the average unique node degree is one of the key features in predicting whether GNNs can generalize to unseen graphs; (2) the graph size is not a fundamental limiting factor of the generalization in GNNs when the average node degree remains invariant across training and test distributions; (3) despite the size-invariant generalization, training GNNs on graphs of high degree (and of large size consequently) is not trivial (4) neural inference by GNNs outperforms algorithmic inferences especially when the pairwise potentials are strong, which naturally makes the inference problem harder.","PeriodicalId":264468,"journal":{"name":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","volume":"63 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Degree Matters: Assessing the Generalization of Graph Neural Network\",\"authors\":\"Hyunmok Park, Kijung Yoon\",\"doi\":\"10.1109/IC-NIDC54101.2021.9660574\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Graph neural network (GNN) is a general framework for using deep neural networks on graph data. The defining feature of a GNN is that it uses a form of neural message passing where vector messages are exchanged between nodes and updated using neural networks. The message passing operation that underlies GNNs has recently been applied to develop neural approximate inference algorithms, but little work has been done on understanding under what conditions GNNs can be used as a core module for building general inference models. To study this question, we consider the task of out-of-distribution generalization where training and test data have different distributions, by systematically investigating how the graph size and structural properties affect the inferential performance of GNNs. 
We find that (1) the average unique node degree is one of the key features in predicting whether GNNs can generalize to unseen graphs; (2) the graph size is not a fundamental limiting factor of the generalization in GNNs when the average node degree remains invariant across training and test distributions; (3) despite the size-invariant generalization, training GNNs on graphs of high degree (and of large size consequently) is not trivial (4) neural inference by GNNs outperforms algorithmic inferences especially when the pairwise potentials are strong, which naturally makes the inference problem harder.\",\"PeriodicalId\":264468,\"journal\":{\"name\":\"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)\",\"volume\":\"63 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-11-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IC-NIDC54101.2021.9660574\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 7th IEEE International Conference on Network Intelligence and Digital Content (IC-NIDC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IC-NIDC54101.2021.9660574","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Graph neural networks (GNNs) are a general framework for applying deep neural networks to graph-structured data. The defining feature of a GNN is a form of neural message passing in which vector messages are exchanged between nodes and updated using neural networks. The message passing operation that underlies GNNs has recently been applied to develop neural approximate inference algorithms, but little work has been done on understanding under what conditions GNNs can serve as a core module for building general inference models. To study this question, we consider the task of out-of-distribution generalization, where training and test data have different distributions, by systematically investigating how graph size and structural properties affect the inferential performance of GNNs. We find that (1) the average unique node degree is one of the key features in predicting whether GNNs can generalize to unseen graphs; (2) graph size is not a fundamental limiting factor for generalization in GNNs when the average node degree remains invariant across training and test distributions; (3) despite this size-invariant generalization, training GNNs on graphs of high degree (and consequently of large size) is not trivial; and (4) neural inference by GNNs outperforms algorithmic inference, especially when the pairwise potentials are strong, which naturally makes the inference problem harder.
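
To make the abstract's two central notions concrete, the sketch below implements one round of sum-aggregation neural message passing and computes the average unique node degree statistic on a random graph. This is a minimal NumPy illustration under our own assumptions; the layer architecture, weight shapes, and function names are illustrative and not the model or code used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

def message_passing_layer(A, H, W_msg, W_upd):
    """One round of neural message passing with sum aggregation.

    A: (n, n) adjacency matrix; H: (n, d) node states;
    W_msg: (d, d) message weights; W_upd: (2d, d) update weights.
    Shapes and nonlinearities are illustrative assumptions.
    """
    msgs = np.tanh(H @ W_msg)      # each node computes a vector message
    agg = A @ msgs                 # each node sums its neighbors' messages
    # update each node's state from its old state and aggregated messages
    return np.tanh(np.concatenate([H, agg], axis=1) @ W_upd)

def average_unique_degree(A):
    """Mean number of distinct neighbors per node (binarized adjacency)."""
    return (A != 0).sum(axis=1).mean()

# toy example: a symmetric random graph with n nodes, edge probability p
n, d, p = 16, 8, 0.3
A = (rng.random((n, n)) < p).astype(float)
A = np.triu(A, 1)
A = A + A.T                        # undirected, no self-loops
H = rng.standard_normal((n, d))
W_msg = 0.1 * rng.standard_normal((d, d))
W_upd = 0.1 * rng.standard_normal((2 * d, d))

H = message_passing_layer(A, H, W_msg, W_upd)
print("average unique node degree:", average_unique_degree(A))
```

Stacking such layers propagates information over multi-hop neighborhoods, which is why neighborhood statistics like the average unique node degree plausibly govern how well a trained GNN transfers to unseen graph distributions.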