Graph node classification with soft-flow convolution and linear-complexity attention mechanism

IF 3.1 | CAS Tier 3 (Computer Science) | JCR Q2: COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS
Xianming Huang, Yang Yan (闫旸), Qiuyan Wang, Haoyu Pan, Hanning Chen, Xingguo Liu
{"title":"Graph node classification with soft-flow convolution and linear-complexity attention mechanism","authors":"Xianming Huang ,&nbsp;Yang Yan (闫旸) ,&nbsp;Qiuyan Wang ,&nbsp;Haoyu Pan ,&nbsp;Hanning Chen ,&nbsp;Xingguo Liu","doi":"10.1016/j.jocs.2025.102628","DOIUrl":null,"url":null,"abstract":"<div><div>Traditional Graph Neural Networks (GNNs) typically use a message-passing mechanism to aggregate information from neighboring nodes. This message-passing mechanism is analogous to diffusing messages, often resulting in the homogenization of node features. GNNs also tend to be ineffective at capturing features from distant nodes and learning the global structure of the graph, which can reduce performance in node classification tasks. To address these issues, this paper proposes a novel model—Enhanced Soft-Flow Graph Convolutional Network (ESAGCN) based on a global attention mechanism. This model defines a learnable, parameterized phase angle that allows the edge directions between nodes to change continuously, enabling features to flow between nodes. Additionally, it incorporates the self-attention mechanism from Transformers to capture global information within the graph network, enhancing the global representation of nodes. We also employ a simple kernel trick to reduce the complexity of the model’s global attention mechanism to linear complexity. Experimental results demonstrate that the integration of global and local information in graphs is crucial for the learning process of GNNs, especially in directed graphs, significantly improving the accuracy of node classification.</div></div>","PeriodicalId":48907,"journal":{"name":"Journal of Computational Science","volume":"90 ","pages":"Article 102628"},"PeriodicalIF":3.1000,"publicationDate":"2025-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Computational Science","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S187775032500105X","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
Citations: 0

Abstract

Traditional Graph Neural Networks (GNNs) typically use a message-passing mechanism to aggregate information from neighboring nodes. Because this mechanism behaves like message diffusion, it often homogenizes node features. GNNs also tend to be ineffective at capturing features from distant nodes and at learning the global structure of the graph, which can reduce performance on node classification tasks. To address these issues, this paper proposes a novel model, the Enhanced Soft-Flow Graph Convolutional Network (ESAGCN), built on a global attention mechanism. The model defines a learnable, parameterized phase angle that lets the edge directions between nodes change continuously, enabling features to flow between nodes. It also incorporates the self-attention mechanism of Transformers to capture global information across the graph, enhancing the global representation of nodes, and employs a simple kernel trick to reduce the global attention mechanism to linear complexity. Experimental results demonstrate that integrating global and local graph information is crucial to the learning process of GNNs, especially on directed graphs, and significantly improves node classification accuracy.
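The abstract names two mechanisms concrete enough to sketch. The PyTorch code below is a minimal reading of them under stated assumptions, not the paper's published implementation: the cos/sin edge parameterization in SoftFlowConv, the elu(x) + 1 feature map, and all class and variable names are our guesses. The linear-attention part follows the standard kernel trick of computing phi(Q) (phi(K)^T V) instead of softmax(Q K^T) V, which avoids ever materializing the N x N attention matrix.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class SoftFlowConv(nn.Module):
    """Soft-flow convolution sketch: one learnable phase angle per directed
    edge controls how much feature mass flows with versus against the edge.
    The cos/sin parameterization and every name here are our assumptions,
    not the paper's published formulation."""

    def __init__(self, dim, num_edges):
        super().__init__()
        # theta = 0 starts as pure forward flow (cos 0 = 1, sin 0 = 0);
        # training can rotate each edge toward bidirectional flow.
        self.theta = nn.Parameter(torch.zeros(num_edges))
        self.lin = nn.Linear(dim, dim)

    def forward(self, x, edge_index):
        src, dst = edge_index                        # each of shape (E,)
        fwd = torch.cos(self.theta).unsqueeze(-1)    # flow along src -> dst
        bwd = torch.sin(self.theta).unsqueeze(-1)    # flow along dst -> src
        out = torch.zeros_like(x)
        out.index_add_(0, dst, fwd * x[src])         # aggregate with the edge
        out.index_add_(0, src, bwd * x[dst])         # aggregate against it
        return self.lin(out)


class LinearGlobalAttention(nn.Module):
    """Kernelized self-attention over all N nodes in O(N) time: replaces
    softmax(Q K^T) V with phi(Q) (phi(K)^T V), where phi(x) = elu(x) + 1."""

    def __init__(self, dim):
        super().__init__()
        self.q = nn.Linear(dim, dim)
        self.k = nn.Linear(dim, dim)
        self.v = nn.Linear(dim, dim)

    def forward(self, x):                            # x: (N, dim)
        q = F.elu(self.q(x)) + 1.0                   # positive feature map
        k = F.elu(self.k(x)) + 1.0
        v = self.v(x)
        kv = k.t() @ v                               # (dim, dim), O(N dim^2)
        z = q @ k.sum(dim=0).unsqueeze(-1)           # (N, 1) normalizer
        return (q @ kv) / (z + 1e-6)                 # no (N, N) matrix built
```

Under these assumptions, an ESAGCN-style layer on node features x of shape (N, d) with a directed edge_index of shape (2, E) might simply sum the two terms, e.g. h = conv(x, edge_index) + attn(x), so that local soft-flow aggregation and global linear attention are learned jointly.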

Source Journal
Journal of Computational Science
Categories: COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS; COMPUTER SCIENCE, THEORY & METHODS
CiteScore: 5.50
Self-citation rate: 3.00%
Articles published: 227
Review time: 41 days
Aims and scope: Computational Science is a rapidly growing multi- and interdisciplinary field that uses advanced computing and data analysis to understand and solve complex problems. It has reached a level of predictive capability that now firmly complements the traditional pillars of experimentation and theory. Recent advances in experimental techniques such as detectors, on-line sensor networks and high-resolution imaging have opened up new windows into physical and biological processes at many levels of detail. The resulting data explosion allows for detailed data-driven modeling and simulation. This new discipline combines computational thinking, modern computational methods, devices and collateral technologies to address problems far beyond the scope of traditional numerical methods. Computational science typically unifies three distinct elements:
• Modeling, algorithms and simulations (e.g. numerical and non-numerical, discrete and continuous);
• Software developed to solve science (e.g. biological, physical, and social), engineering, medicine, and humanities problems;
• Computer and information science that develops and optimizes the advanced system hardware, software, networking, and data management components (e.g. problem solving environments).