Yueyang Pi, Yang Huang, Yongquan Shi, Fuhai Chen, Shiping Wang
{"title":"具有柔性传播算子的隐式图神经网络","authors":"Yueyang Pi , Yang Huang , Yongquan Shi , Fuhai Chen , Shiping Wang","doi":"10.1016/j.neunet.2025.108143","DOIUrl":null,"url":null,"abstract":"<div><div>Due to the capability to capture high-order information of nodes and reduce memory consumption, implicit graph neural networks have become an explored hotspot in recent years. However, these implicit graph neural networks are limited by the static topology, which makes it difficult to handle heterophilic graph-structured data. Furthermore, the existing methods inspired by optimization problem are limited by the explicit structure of graph neural networks, which makes it difficult to set an appropriate number of network layers to solve optimization problems. To address these issues, we propose an implicit graph neural network with flexible propagation operators in this paper. From the optimization objective function, we derive an implicit message passing formula with flexible propagation operators. Compared to the static operator, the proposed method that joints the dynamic semantic and topology of data is more applicable to heterophilic graphs. Moreover, the proposed model performs a fixed-point iterative process for the optimization of the objective function, which implicitly adjusts the number of network layers without requiring sufficient prior knowledge. Extensive experiment results demonstrate the superiority of the proposed model.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"194 ","pages":"Article 108143"},"PeriodicalIF":6.3000,"publicationDate":"2025-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Implicit graph neural networks with flexible propagation operators\",\"authors\":\"Yueyang Pi , Yang Huang , Yongquan Shi , Fuhai Chen , Shiping Wang\",\"doi\":\"10.1016/j.neunet.2025.108143\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Due to the capability to capture high-order information of nodes and reduce memory consumption, implicit graph neural networks have become an explored hotspot in recent years. However, these implicit graph neural networks are limited by the static topology, which makes it difficult to handle heterophilic graph-structured data. Furthermore, the existing methods inspired by optimization problem are limited by the explicit structure of graph neural networks, which makes it difficult to set an appropriate number of network layers to solve optimization problems. To address these issues, we propose an implicit graph neural network with flexible propagation operators in this paper. From the optimization objective function, we derive an implicit message passing formula with flexible propagation operators. Compared to the static operator, the proposed method that joints the dynamic semantic and topology of data is more applicable to heterophilic graphs. Moreover, the proposed model performs a fixed-point iterative process for the optimization of the objective function, which implicitly adjusts the number of network layers without requiring sufficient prior knowledge. 
Extensive experiment results demonstrate the superiority of the proposed model.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"194 \",\"pages\":\"Article 108143\"},\"PeriodicalIF\":6.3000,\"publicationDate\":\"2025-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608025010238\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025010238","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Implicit graph neural networks with flexible propagation operators
Owing to their capability to capture high-order information about nodes while reducing memory consumption, implicit graph neural networks have become a research hotspot in recent years. However, existing implicit graph neural networks are limited by a static topology, which makes it difficult to handle heterophilic graph-structured data. Furthermore, existing optimization-inspired methods are constrained by the explicit structure of graph neural networks, making it difficult to set an appropriate number of network layers for solving the optimization problem. To address these issues, this paper proposes an implicit graph neural network with flexible propagation operators. From the optimization objective function, we derive an implicit message-passing formula with flexible propagation operators. Compared with a static operator, the proposed method, which jointly exploits the dynamic semantics and topology of the data, is better suited to heterophilic graphs. Moreover, the proposed model optimizes the objective function through a fixed-point iterative process, which implicitly adjusts the number of network layers without requiring extensive prior knowledge. Extensive experimental results demonstrate the superiority of the proposed model.
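The core mechanism the abstract describes, iterating a message-passing map to a fixed point while re-estimating the propagation operator from the current node features instead of holding it static, can be illustrated with a short sketch. The PyTorch snippet below is a minimal, hypothetical illustration, not the authors' published model: the class name, the 50/50 blend of the fixed adjacency with a feature-similarity graph, and the contraction factor gamma are all illustrative assumptions, and a faithful implementation would additionally constrain the weights (e.g., by spectral-norm projection) to guarantee that the fixed-point iteration converges.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class ImplicitFlexibleGNN(nn.Module):
    """Sketch of an implicit GNN layer: solve Z = tanh(gamma * P(Z) Z W + X U)
    by fixed-point (Picard) iteration, where the propagation operator P is
    recomputed from the current features rather than fixed in advance."""

    def __init__(self, in_dim, hid_dim, gamma=0.8, max_iter=50, tol=1e-4):
        super().__init__()
        self.W = nn.Linear(hid_dim, hid_dim, bias=False)
        self.U = nn.Linear(in_dim, hid_dim, bias=False)
        self.gamma = gamma        # damping factor, encourages contraction
        self.max_iter = max_iter  # cap on fixed-point iterations
        self.tol = tol            # relative stopping tolerance

    def propagation_operator(self, z, adj):
        # Hypothetical "flexible" operator: blend the given topology with a
        # row-stochastic similarity graph computed from the current features.
        sim = F.softmax(z @ z.t() / z.shape[1] ** 0.5, dim=-1)
        return 0.5 * adj + 0.5 * sim

    def forward(self, x, adj):
        b = self.U(x)             # input injection, fixed during the iteration
        z = torch.zeros_like(b)   # start the iteration from zero
        for _ in range(self.max_iter):
            p = self.propagation_operator(z, adj)
            z_next = torch.tanh(self.gamma * p @ self.W(z) + b)
            # Stop once successive iterates are close, i.e. an approximate
            # fixed point has been reached; the iteration count plays the
            # role of an implicit, data-dependent network depth.
            if (z_next - z).norm() < self.tol * z.norm().clamp(min=1.0):
                return z_next
            z = z_next
        return z                  # return the last iterate if not converged
```

Under these assumptions the layer would be used as `z = model(x, adj)`, with `x` the node-feature matrix and `adj` a row-normalized adjacency matrix; the early-exit test is what lets the effective number of propagation steps adapt per input rather than being fixed as a hyperparameter.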
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.