Unifying Graph Neural Networks with a Generalized Optimization Framework

Impact Factor 5.4 · CAS Region 2, Computer Science · JCR Q1, COMPUTER SCIENCE, INFORMATION SYSTEMS
Chuan Shi, Meiqi Zhu, Yue Yu, Xiao Wang, Junping Du
Journal: ACM Transactions on Information Systems
DOI: 10.1145/3660852
Published: 2024-04-25 (Journal Article)
Citations: 0

Abstract

Graph Neural Networks (GNNs) have received considerable attention for graph-structured data learning across a wide variety of tasks. The well-designed propagation mechanism, which has been demonstrated to be effective, is the most fundamental part of GNNs. Although most GNNs follow a message-passing scheme, little effort has been made to discover and analyze the essential relations among them. In this paper, we establish a surprising connection between different propagation mechanisms and an optimization problem. We show that, despite the proliferation of various GNNs, their proposed propagation mechanisms are in fact the optimal solutions of a generalized optimization framework with a flexible feature-fitting function and a generalized graph regularization term. The optimization framework not only helps in understanding the propagation mechanisms of GNNs but also opens up opportunities for flexibly designing new GNNs. By analyzing the general solutions of the optimization framework, we provide a more convenient way to derive the corresponding propagation results of GNNs. We further discover that existing works usually use naïve graph convolutional kernels as the feature-fitting function, or use only one-hop structural information (the original topology graph) in the graph regularization term. Correspondingly, we develop two novel objective functions based on adjustable graph kernels with low-pass or high-pass filtering capabilities, and one novel objective function that incorporates high-order structural information during propagation. Extensive experiments on benchmark datasets clearly show that the newly proposed GNNs not only outperform state-of-the-art methods but also alleviate over-smoothing, further verifying the feasibility of designing GNNs with the generalized unified optimization framework.
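As an illustration of the kind of connection the abstract describes (a sketch of the well-known graph-signal-denoising view, not the paper's exact formulation), consider the objective min_Z α‖Z − X‖² + (1 − α)·tr(Zᵀ L Z) with L = I − Â, whose closed-form minimizer is Z* = α(I − (1 − α)Â)⁻¹X; a personalized-PageRank-style iteration Z ← (1 − α)ÂZ + αX converges to the same solution, recovering a familiar propagation rule as the optimum of a feature-fitting term plus a graph regularization term. The toy graph, features, and α below are illustrative choices:

```python
import numpy as np

# Toy 4-node undirected graph with GCN-style symmetric normalization.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
A_hat = A + np.eye(4)                       # add self-loops
d = A_hat.sum(axis=1)
A_norm = A_hat / np.sqrt(np.outer(d, d))    # D^{-1/2} (A + I) D^{-1/2}

X = np.random.default_rng(0).normal(size=(4, 2))  # node features
alpha = 0.1                                 # weight of the feature-fitting term

# Closed-form minimizer of  alpha*||Z - X||_F^2 + (1-alpha)*tr(Z^T L Z)
# with L = I - A_norm:  setting the gradient to zero gives
# (I - (1-alpha) A_norm) Z* = alpha X.
Z_closed = alpha * np.linalg.solve(np.eye(4) - (1 - alpha) * A_norm, X)

# The same optimum reached by iterative propagation
# (personalized-PageRank / APPNP-style update).
Z = X.copy()
for _ in range(200):
    Z = (1 - alpha) * A_norm @ Z + alpha * X

print(np.allclose(Z, Z_closed, atol=1e-6))  # True
```

The fixed point of the iteration satisfies (I − (1 − α)Â)Z = αX, i.e. exactly the closed-form optimum; convergence follows because the spectral radius of (1 − α)Â is below 1.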
Source journal
ACM Transactions on Information Systems (Engineering & Technology — Computer Science: Information Systems)
CiteScore: 9.40
Self-citation rate: 14.30%
Annual publications: 165
Review time: >12 weeks
Journal description: The ACM Transactions on Information Systems (TOIS) publishes papers on information retrieval (such as search engines and recommender systems) that contain: new principled information retrieval models or algorithms with sound empirical validation; observational, experimental, and/or theoretical studies yielding new insights into information retrieval or information seeking; accounts of applications of existing information retrieval techniques that shed light on the strengths and weaknesses of the techniques; formalization of new information retrieval or information seeking tasks and of methods for evaluating the performance on those tasks; development of content (text, image, speech, video, etc.) analysis methods to support information retrieval and information seeking; development of computational models of user information preferences and interaction behaviors; creation and analysis of evaluation methodologies for information retrieval and information seeking; or surveys of existing work that propose a significant synthesis. The information retrieval scope of TOIS appeals to industry practitioners for its wealth of creative ideas, and to academic researchers for its descriptions of their colleagues' work.