Dynamic sequential neighbor processing: A liquid neural network-inspired framework for enhanced graph neural networks

Kuijie Zhang, Shanchen Pang, Yuanyuan Zhang, Yun Bai, Luqi Wang, Jerry Chun-Wei Lin

Information Sciences, vol. 719, Article 122452 · Published 2025-06-26
DOI: 10.1016/j.ins.2025.122452 · https://www.sciencedirect.com/science/article/pii/S0020025525005845
Impact Factor: 8.1 · JCR Region 1 (Computer Science) · Category: Computer Science, Information Systems
Citations: 0

Abstract

Integrating information from multi-order neighborhoods is a fundamental strategy in Graph Neural Networks (GNNs) for capturing higher-order structural patterns and enhancing the expressive power of node representations. However, most existing GNNs treat neighbors from different orders as unordered sets and integrate them using static or parallel strategies, thus overlooking the sequential and evolving nature of neighborhood expansion. To address this limitation, we propose a novel GNN framework, SL, which integrates Serialized Neighbor Features with Liquid Neural Networks (LNNs) to enable order-aware, dynamic adaptation of neighbor influence. By modeling neighbor features as ordered sequences and leveraging LNNs' internal feedback dynamics, SL adapts feature extraction in real time based on local context and propagation history. This design offers fine-grained control over hierarchical dependencies and allows dynamic modulation of contributions from different neighborhood layers. SL is model-agnostic and can be seamlessly integrated with both classical and state-of-the-art GNNs. Extensive experiments across ten benchmark datasets show that SL consistently improves node classification accuracy and significantly alleviates over-smoothing in deep GNNs. These results highlight that order-aware and dynamically regulated propagation represents a powerful, flexible alternative to traditional multi-order aggregation, enhancing the adaptability and expressiveness of GNNs for complex graph learning tasks.
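The paper's full SL architecture is not reproduced on this page; the following is a minimal NumPy sketch of the core idea described in the abstract — aggregating features hop by hop, treating the hop order as a sequence, and feeding that sequence through a liquid-time-constant-style recurrent cell whose effective time constant depends on its input, so contributions from different neighborhood orders are modulated dynamically rather than with fixed coefficients. The function names, the specific gate form, and the random weights are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def hop_features(adj, x, num_hops):
    """Mean-aggregate node features at each neighborhood order (hop)."""
    # Row-normalize the adjacency so each propagation step is a neighbor mean.
    deg = adj.sum(axis=1, keepdims=True)
    p = adj / np.maximum(deg, 1.0)
    feats, h = [x], x
    for _ in range(num_hops):
        h = p @ h                 # propagate one more hop outward
        feats.append(h)
    return feats                  # ordered sequence: hop 0, 1, ..., num_hops

def ltc_step(h, u, w_in, w_rec, b, tau=1.0, dt=0.1):
    """One liquid-time-constant-style update: the decay rate depends on the
    current input, so each hop's influence on the state is input-adaptive."""
    f = np.tanh(u @ w_in + h @ w_rec + b)          # input-dependent drive
    return h + dt * (-h / (tau + np.abs(f).mean()) + f)

def sl_readout(adj, x, num_hops, rng):
    """Process the hop-feature sequence in order through the liquid cell."""
    d = x.shape[1]
    w_in = rng.normal(0.0, 0.1, (d, d))            # illustrative random weights
    w_rec = rng.normal(0.0, 0.1, (d, d))
    b = np.zeros(d)
    h = np.zeros_like(x)
    for u in hop_features(adj, x, num_hops):       # hop order = sequence order
        h = ltc_step(h, u, w_in, w_rec, b)
    return h                                       # order-aware node embedding
```

Because the cell carries state across hops, the final embedding depends on the order in which neighborhood layers arrive — the property the abstract contrasts with static, unordered multi-order aggregation.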
Source journal: Information Sciences (Engineering & Technology — Computer Science: Information Systems)
CiteScore: 14.00 · Self-citation rate: 17.30% · Articles per year: 1322 · Review time: 10.4 months
Journal description: Information Sciences is an esteemed international journal that focuses on publishing original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and surveying contributions. Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.