FastHGNN: A New Sampling Technique for Learning with Hypergraph Neural Networks

IF 4.0 · CAS Region 3 (Computer Science) · JCR Q1 (COMPUTER SCIENCE, INFORMATION SYSTEMS)
Fengcheng Lu, Michael Kwok-Po Ng
{"title":"FastHGNN:超图神经网络学习的新取样技术","authors":"Fengcheng Lu, Michael Kwok-Po Ng","doi":"10.1145/3663670","DOIUrl":null,"url":null,"abstract":"<p>Hypergraphs can represent higher-order relations among objects. Traditional hypergraph neural networks involve node-edge-node transform, leading to high computational cost and timing. The main aim of this paper is to propose a new sampling technique for learning with hypergraph neural networks. The core idea is to design a layer-wise sampling scheme for nodes and hyperedges to approximate the original hypergraph convolution. We rewrite hypergraph convolution in the form of double integral and leverage Monte Carlo to achieve a discrete and consistent estimator. In addition, we use importance sampling and finally derive feasible probability mass functions for both nodes and hyperedges in consideration of variance reduction, based on some assumptions. Notably, the proposed sampling technique allows us to handle large-scale hypergraph learning, which is not feasible with traditional hypergraph neural networks. Experiment results demonstrate that our proposed model keeps a good balance between running time and prediction accuracy.</p>","PeriodicalId":49249,"journal":{"name":"ACM Transactions on Knowledge Discovery from Data","volume":"76 1","pages":""},"PeriodicalIF":4.0000,"publicationDate":"2024-05-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"FastHGNN: A New Sampling Technique for Learning with Hypergraph Neural Networks\",\"authors\":\"Fengcheng Lu, Michael Kwok-Po Ng\",\"doi\":\"10.1145/3663670\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Hypergraphs can represent higher-order relations among objects. Traditional hypergraph neural networks involve node-edge-node transform, leading to high computational cost and timing. The main aim of this paper is to propose a new sampling technique for learning with hypergraph neural networks. The core idea is to design a layer-wise sampling scheme for nodes and hyperedges to approximate the original hypergraph convolution. We rewrite hypergraph convolution in the form of double integral and leverage Monte Carlo to achieve a discrete and consistent estimator. In addition, we use importance sampling and finally derive feasible probability mass functions for both nodes and hyperedges in consideration of variance reduction, based on some assumptions. Notably, the proposed sampling technique allows us to handle large-scale hypergraph learning, which is not feasible with traditional hypergraph neural networks. 
Experiment results demonstrate that our proposed model keeps a good balance between running time and prediction accuracy.</p>\",\"PeriodicalId\":49249,\"journal\":{\"name\":\"ACM Transactions on Knowledge Discovery from Data\",\"volume\":\"76 1\",\"pages\":\"\"},\"PeriodicalIF\":4.0000,\"publicationDate\":\"2024-05-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on Knowledge Discovery from Data\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1145/3663670\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Knowledge Discovery from Data","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/3663670","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract

Hypergraphs can represent higher-order relations among objects. Traditional hypergraph neural networks involve a node-edge-node transform, leading to high computational cost and long running times. The main aim of this paper is to propose a new sampling technique for learning with hypergraph neural networks. The core idea is to design a layer-wise sampling scheme for nodes and hyperedges that approximates the original hypergraph convolution. We rewrite the hypergraph convolution in the form of a double integral and leverage Monte Carlo sampling to obtain a discrete and consistent estimator. In addition, we apply importance sampling and, under some assumptions, derive feasible probability mass functions for both nodes and hyperedges with variance reduction in mind. Notably, the proposed sampling technique allows us to handle large-scale hypergraph learning, which is not feasible with traditional hypergraph neural networks. Experimental results demonstrate that our proposed model strikes a good balance between running time and prediction accuracy.
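To make the sampling idea concrete, below is a minimal NumPy sketch of a layer-wise importance-sampling approximation to hypergraph convolution. It is an illustration only, not the authors' method: the random-walk-style normalization, the degree-proportional sampling probabilities, and all function names are assumptions for the sketch, not the variance-reducing probability mass functions or the estimator derived in the paper.

```python
# Minimal sketch: layer-wise importance sampling for hypergraph convolution.
# The normalization, the degree-proportional sampling pmfs, and all names are
# illustrative assumptions, not the paper's derived formulation.
import numpy as np

def full_hypergraph_conv(H, X, Theta):
    """Exact node-edge-node transform: Dv^{-1} H De^{-1} H^T X Theta."""
    Dv = H.sum(axis=1)                         # node degrees, shape (n,)
    De = H.sum(axis=0)                         # hyperedge degrees, shape (m,)
    edge_feat = (H / De).T @ X                 # gather node features into hyperedges
    node_feat = (H / Dv[:, None]) @ edge_feat  # scatter back to nodes
    return node_feat @ Theta

def sampled_hypergraph_conv(H, X, Theta, n_edge_samples, n_node_samples, rng):
    """Monte Carlo estimate of the same transform using sampled nodes and
    hyperedges, rescaled by 1/(k * p) so the estimator stays unbiased."""
    n, m = H.shape
    Dv = H.sum(axis=1)
    De = H.sum(axis=0)
    p_v = Dv / Dv.sum()                        # node sampling pmf (assumed form)
    p_e = De / De.sum()                        # hyperedge sampling pmf (assumed form)

    v_idx = rng.choice(n, size=n_node_samples, replace=True, p=p_v)
    e_idx = rng.choice(m, size=n_edge_samples, replace=True, p=p_e)
    He = H[:, e_idx]                           # incidence restricted to sampled hyperedges

    # Node -> hyperedge step over sampled nodes only, importance-weighted.
    edge_feat = (He[v_idx] / De[e_idx]).T @ (X[v_idx] / (n_node_samples * p_v[v_idx, None]))
    # Hyperedge -> node step over sampled hyperedges only, importance-weighted.
    node_feat = (He / Dv[:, None]) @ (edge_feat / (n_edge_samples * p_e[e_idx, None]))
    return node_feat @ Theta

rng = np.random.default_rng(0)
H = (rng.random((200, 80)) < 0.05).astype(float)   # random incidence matrix
H[H.sum(axis=1) == 0, 0] = 1.0                     # guard: no isolated nodes
H[0, H.sum(axis=0) == 0] = 1.0                     # guard: no empty hyperedges
X = rng.standard_normal((200, 16))                 # node features
Theta = rng.standard_normal((16, 8))               # layer weights

exact = full_hypergraph_conv(H, X, Theta)
approx = sampled_hypergraph_conv(H, X, Theta, n_edge_samples=40, n_node_samples=100, rng=rng)
print("relative error:", np.linalg.norm(exact - approx) / np.linalg.norm(exact))
```

Increasing n_edge_samples and n_node_samples should drive the printed relative error toward zero, which is the consistency property the abstract refers to; applied per layer, such a scheme keeps the cost of each layer proportional to the number of sampled nodes and hyperedges rather than to the full incidence structure.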

Source journal: ACM Transactions on Knowledge Discovery from Data
Categories: Computer Science, Information Systems; Computer Science, Software Engineering
CiteScore: 6.70
Self-citation rate: 5.60%
Articles per year: 172
Review time: 3 months
Journal description: TKDD welcomes papers on a full range of research in the knowledge discovery and analysis of diverse forms of data. Such subjects include, but are not limited to: scalable and effective algorithms for data mining and big data analysis, mining brain networks, mining data streams, mining multi-media data, mining high-dimensional data, mining text, Web, and semi-structured data, mining spatial and temporal data, data mining for community generation, social network analysis, and graph structured data, security and privacy issues in data mining, visual, interactive and online data mining, pre-processing and post-processing for data mining, robust and scalable statistical methods, data mining languages, foundations of data mining, KDD framework and process, and novel applications and infrastructures exploiting data mining technology including massively parallel processing and cloud computing platforms. TKDD encourages papers that explore the above subjects in the context of large distributed networks of computers, parallel or multiprocessing computers, or new data devices. TKDD also encourages papers that describe emerging data mining applications that cannot be satisfied by the current data mining technology.