Adaptive receptive field graph neural networks

Hepeng Gao, Funing Yang, Yongjian Yang, Yuanbo Xu, Yijun Su
{"title":"自适应感受野图神经网络","authors":"Hepeng Gao , Funing Yang , Yongjian Yang , Yuanbo Xu , Yijun Su","doi":"10.1016/j.neunet.2025.107658","DOIUrl":null,"url":null,"abstract":"<div><div>Graph Neural Networks (GNNs) have drawn increasing attention in recent years and achieved outstanding success in many scenarios and tasks. However, existing methods indicate that the performance of representation learning drops dramatically as GNNs deepen, which is attributed to <strong>over-smoothing representation</strong>. To handle the above issue, we propose an adaptive receptive field graph neural network (ADRP-GNN) that aggregates information by adaptively expanding receptive fields with a monolayer graph convolution layer, avoiding deepening to result in the over-smoothing issue. Specifically, we first present a Multi-hop Graph Convolution Network (MuGC) that captures the information of the nodes and their multi-hop neighbors with only one layer, preventing frequent passing messages between nodes from the over-smoothing issue. Then, we design a Meta Learner that realizes the adaptive receptive field for each node to select related neighbor information. Finally, a Backbone Network is employed to enhance the architecture’s learning ability. In addition, our architecture adaptively generates receptive fields instead of handcrafting stacked layers, which can integrate existing GNN frameworks to fit various scenarios. Extensive experiments indicate that our architecture is effective for the over-smoothing issue and improves accuracy by 0.52% to 6.88% compared to state-of-the-art methods on node classification tasks on eight datasets.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"190 ","pages":"Article 107658"},"PeriodicalIF":6.0000,"publicationDate":"2025-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Adaptive receptive field graph neural networks\",\"authors\":\"Hepeng Gao , Funing Yang , Yongjian Yang , Yuanbo Xu , Yijun Su\",\"doi\":\"10.1016/j.neunet.2025.107658\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Graph Neural Networks (GNNs) have drawn increasing attention in recent years and achieved outstanding success in many scenarios and tasks. However, existing methods indicate that the performance of representation learning drops dramatically as GNNs deepen, which is attributed to <strong>over-smoothing representation</strong>. To handle the above issue, we propose an adaptive receptive field graph neural network (ADRP-GNN) that aggregates information by adaptively expanding receptive fields with a monolayer graph convolution layer, avoiding deepening to result in the over-smoothing issue. Specifically, we first present a Multi-hop Graph Convolution Network (MuGC) that captures the information of the nodes and their multi-hop neighbors with only one layer, preventing frequent passing messages between nodes from the over-smoothing issue. Then, we design a Meta Learner that realizes the adaptive receptive field for each node to select related neighbor information. Finally, a Backbone Network is employed to enhance the architecture’s learning ability. In addition, our architecture adaptively generates receptive fields instead of handcrafting stacked layers, which can integrate existing GNN frameworks to fit various scenarios. 
Extensive experiments indicate that our architecture is effective for the over-smoothing issue and improves accuracy by 0.52% to 6.88% compared to state-of-the-art methods on node classification tasks on eight datasets.</div></div>\",\"PeriodicalId\":49763,\"journal\":{\"name\":\"Neural Networks\",\"volume\":\"190 \",\"pages\":\"Article 107658\"},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2025-06-05\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Networks\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0893608025005386\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025005386","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Graph Neural Networks (GNNs) have drawn increasing attention in recent years and achieved outstanding success in many scenarios and tasks. However, existing methods indicate that the performance of representation learning drops dramatically as GNNs deepen, which is attributed to over-smoothing representation. To handle the above issue, we propose an adaptive receptive field graph neural network (ADRP-GNN) that aggregates information by adaptively expanding receptive fields with a monolayer graph convolution layer, avoiding deepening to result in the over-smoothing issue. Specifically, we first present a Multi-hop Graph Convolution Network (MuGC) that captures the information of the nodes and their multi-hop neighbors with only one layer, preventing frequent passing messages between nodes from the over-smoothing issue. Then, we design a Meta Learner that realizes the adaptive receptive field for each node to select related neighbor information. Finally, a Backbone Network is employed to enhance the architecture’s learning ability. In addition, our architecture adaptively generates receptive fields instead of handcrafting stacked layers, which can integrate existing GNN frameworks to fit various scenarios. Extensive experiments indicate that our architecture is effective for the over-smoothing issue and improves accuracy by 0.52% to 6.88% compared to state-of-the-art methods on node classification tasks on eight datasets.
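To make the idea concrete, below is a minimal sketch (not the authors' released code) of a single graph-convolution layer that mixes 0..K-hop neighborhoods in one pass, with a small gating network producing per-node hop weights, in the spirit of MuGC combined with the Meta Learner. All names here (MultiHopConv, max_hops, the softmax gate) and the specific gating design are illustrative assumptions.

# Sketch of a one-layer multi-hop graph convolution with per-node hop gating.
# Assumed design, for illustration only; PyTorch is used for brevity.
import torch
import torch.nn as nn


def normalized_adjacency(adj: torch.Tensor) -> torch.Tensor:
    """Symmetrically normalize A + I, as in standard GCNs: D^{-1/2}(A+I)D^{-1/2}."""
    a_hat = adj + torch.eye(adj.size(0))
    d_inv_sqrt = a_hat.sum(dim=1).pow(-0.5)
    return d_inv_sqrt.unsqueeze(1) * a_hat * d_inv_sqrt.unsqueeze(0)


class MultiHopConv(nn.Module):
    """One layer aggregating 0..max_hops-hop information in a single pass."""

    def __init__(self, in_dim: int, out_dim: int, max_hops: int = 3):
        super().__init__()
        self.max_hops = max_hops
        self.proj = nn.Linear(in_dim, out_dim)
        # Gate outputs per-node weights over hop levels: the "adaptive receptive field".
        self.gate = nn.Sequential(nn.Linear(in_dim, max_hops + 1), nn.Softmax(dim=-1))

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        a_hat = normalized_adjacency(adj)
        hop_feats = [x]                        # 0-hop: the node's own features
        for _ in range(self.max_hops):         # k-hop: repeated feature propagation
            hop_feats.append(a_hat @ hop_feats[-1])
        weights = self.gate(x)                 # (N, K+1): one hop mixture per node
        stacked = torch.stack(hop_feats, dim=1)            # (N, K+1, F)
        mixed = (weights.unsqueeze(-1) * stacked).sum(dim=1)  # weighted hop mixture
        return self.proj(mixed)


# Tiny usage example on a random 5-node undirected graph.
adj = (torch.rand(5, 5) > 0.5).float()
adj = ((adj + adj.t()) > 0).float().fill_diagonal_(0)
layer = MultiHopConv(in_dim=8, out_dim=4, max_hops=3)
out = layer(torch.randn(5, 8), adj)
print(out.shape)  # torch.Size([5, 4])

Because all hops are mixed inside one layer, the receptive field grows without stacking layers, which is the mechanism the abstract credits for avoiding over-smoothing.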
About the journal:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.