Title: Addressing information bottlenecks in graph augmented large language models via graph neural summarization
Authors: Wooyoung Kim, Wooju Kim
DOI: 10.1016/j.inffus.2025.103784
Journal: Information Fusion, Volume 127, Article 103784 (Journal Article)
Publication date: 2025-09-27
Impact factor: 15.5; JCR: Q1 (Computer Science, Artificial Intelligence); Region: 1 (Computer Science)
URL: https://www.sciencedirect.com/science/article/pii/S1566253525008462
Citations: 0
Abstract
This study investigates the problem of information bottlenecks in graph-level prompting, where compressing all node embeddings into a single vector leads to significant structural information loss. We clarify and systematically analyze this challenge, and propose the Graph Neural Summarizer (GNS), a continuous prompting framework that generates multiple query-aware prompt vectors to better preserve graph structure and improve context relevance. Experiments on ExplaGraphs, SceneGraphs, and WebQSP show that GNS consistently improves performance over strong graph-level prompting baselines. These findings emphasize the importance of addressing information bottlenecks when integrating graph-structured data with large language models. Implementation details and source code are publicly available at https://github.com/timothy-coshin/GraphNeuralSummarizer.
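The abstract contrasts graph-level prompting, which compresses a whole graph into one vector, with GNS, which emits several query-aware prompt vectors. The paper's actual architecture is in the linked repository; the following is only a minimal numpy sketch of the general idea, where the summary queries, their random initialization, and the additive query conditioning are all illustrative assumptions, not the authors' method.

```python
import numpy as np

def mean_pool_prompt(node_embs):
    # Graph-level prompting baseline: all node embeddings are
    # collapsed into a single vector -- the information bottleneck.
    return node_embs.mean(axis=0, keepdims=True)          # shape (1, d)

def multi_vector_prompt(node_embs, query_emb, num_prompts=4, seed=0):
    # Hypothetical multi-vector summarization: a small set of
    # "summary query" vectors cross-attends over the node embeddings,
    # so structural detail is spread across several prompt slots.
    rng = np.random.default_rng(seed)
    d = node_embs.shape[1]
    # Randomly initialized queries, shifted by the question embedding
    # to make the prompts query-aware (an assumed conditioning scheme).
    queries = rng.standard_normal((num_prompts, d)) + query_emb
    scores = queries @ node_embs.T / np.sqrt(d)           # (k, n)
    attn = np.exp(scores - scores.max(axis=1, keepdims=True))
    attn /= attn.sum(axis=1, keepdims=True)               # softmax rows
    return attn @ node_embs                               # (k, d)

nodes = np.random.default_rng(1).standard_normal((10, 16))
question = np.zeros(16)
print(mean_pool_prompt(nodes).shape)            # (1, 16): one vector
print(multi_vector_prompt(nodes, question).shape)  # (4, 16): k vectors
```

The point of the contrast is dimensional: the baseline hands the language model a single d-dimensional summary regardless of graph size, while the multi-vector variant preserves k distinct, query-conditioned views of the node set.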
About the journal:
Information Fusion serves as a central platform for showcasing advancements in multi-sensor, multi-source, multi-process information fusion, fostering collaboration among the diverse disciplines driving its progress. It is the leading outlet for sharing research and development in this field, focusing on architectures, algorithms, and applications. Papers presenting fundamental theoretical analyses, as well as those demonstrating applications to real-world problems, are welcome.