{"title":"KG-prompt: Interpretable knowledge graph prompt for pre-trained language models","authors":"Liyi Chen , Jie Liu , Yutai Duan , Runze Wang","doi":"10.1016/j.knosys.2025.113118","DOIUrl":null,"url":null,"abstract":"<div><div>Knowledge graphs (KGs) can provide rich factual knowledge for language models, enhancing reasoning ability and interpretability. However, existing knowledge injection methods usually ignore the structured information in KGs. Using structured knowledge to enhance pre-trained language models (PLMs) still has a set of challenging issues, including resource consumption of knowledge retraining, heterogeneous information, and knowledge noise. To address these issues, we explore how to flexibly inject structured knowledge into frozen PLMs. Inspired by prompt learning, we propose a novel method <strong>K</strong>nowledge <strong>G</strong>raph <strong>Prompt</strong> (KG-Prompt), which for the first time encodes the KG as structured prompts to enhance the knowledge expression ability of PLMs. KG-Prompt consists of a compressed subgraph construction module and a KG prompt generation module. In the compressed subgraph construction module, we construct compressed subgraphs based on a path-weighting strategy to reduce knowledge noise. In the KG prompt generation module, we propose a multi-hop consistency optimization strategy to learn the representation of compressed subgraphs, and then generate KG prompts based on a knowledge mapper to solve the heterogeneous information problem. The KG prompts can be inserted into the input of PLMs expediently, which decouples from PLMs and the downstream model without knowledge retraining and reduces computational resources. Extensive experiments on three knowledge-driven natural language understanding tasks demonstrate that our approach effectively improves the knowledge reasoning ability of PLMs. Furthermore, we provide a detailed analysis of different KG prompts and discuss the interpretability and generalizability of the proposed method.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"311 ","pages":"Article 113118"},"PeriodicalIF":7.2000,"publicationDate":"2025-02-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125001650","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Knowledge graphs (KGs) can provide rich factual knowledge for language models, enhancing their reasoning ability and interpretability. However, existing knowledge injection methods usually ignore the structured information in KGs. Using structured knowledge to enhance pre-trained language models (PLMs) still faces a set of challenging issues, including the resource consumption of knowledge retraining, heterogeneous information, and knowledge noise. To address these issues, we explore how to flexibly inject structured knowledge into frozen PLMs. Inspired by prompt learning, we propose a novel method, Knowledge Graph Prompt (KG-Prompt), which for the first time encodes a KG as structured prompts to enhance the knowledge expression ability of PLMs. KG-Prompt consists of a compressed subgraph construction module and a KG prompt generation module. In the compressed subgraph construction module, we construct compressed subgraphs based on a path-weighting strategy to reduce knowledge noise. In the KG prompt generation module, we propose a multi-hop consistency optimization strategy to learn representations of the compressed subgraphs, and then generate KG prompts through a knowledge mapper to resolve the heterogeneous information problem. The KG prompts can be conveniently inserted into the input of a PLM; they are decoupled from both the PLM and the downstream model, so no knowledge retraining is required and computational cost is reduced. Extensive experiments on three knowledge-driven natural language understanding tasks demonstrate that our approach effectively improves the knowledge reasoning ability of PLMs. Furthermore, we provide a detailed analysis of different KG prompts and discuss the interpretability and generalizability of the proposed method.
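To make the prompt-insertion mechanism concrete, the PyTorch sketch below illustrates the general pattern the abstract describes: prune a retrieved subgraph down to its highest-weighted triples, map the node embeddings into the PLM's embedding space with a small mapper network, and prepend the resulting soft tokens to the frozen PLM's input embeddings. This is a minimal sketch under stated assumptions, not the authors' implementation: the names (KnowledgeMapper, compress_subgraph, build_prompted_inputs), the two-layer MLP mapper, and the top-k triple pruning are all illustrative stand-ins, and the paper's actual path-weighting strategy and multi-hop consistency optimization are not reproduced here.

```python
# Hypothetical sketch of the KG-Prompt idea: prune a subgraph, map KG node
# embeddings into the PLM embedding space, and prepend them as soft prompts.
# All names and design choices are illustrative assumptions.
import torch
import torch.nn as nn


class KnowledgeMapper(nn.Module):
    """Assumed MLP that projects KG embeddings into the PLM embedding space."""

    def __init__(self, kg_dim: int, plm_dim: int):
        super().__init__()
        self.proj = nn.Sequential(
            nn.Linear(kg_dim, plm_dim),
            nn.GELU(),
            nn.Linear(plm_dim, plm_dim),
        )

    def forward(self, kg_embeddings: torch.Tensor) -> torch.Tensor:
        # (num_nodes, kg_dim) -> (num_nodes, plm_dim): one soft token per node.
        return self.proj(kg_embeddings)


def compress_subgraph(triples: list, scores: list, keep: int) -> list:
    """Keep the `keep` highest-scored triples (a top-k stand-in for the
    paper's path-weighting strategy, used here to illustrate noise reduction)."""
    ranked = sorted(zip(triples, scores), key=lambda x: x[1], reverse=True)
    return [triple for triple, _ in ranked[:keep]]


def build_prompted_inputs(input_embeds: torch.Tensor,
                          attention_mask: torch.Tensor,
                          kg_prompts: torch.Tensor):
    """Prepend soft KG prompt tokens to the frozen PLM's input embeddings,
    extending the attention mask to cover the new prompt positions."""
    batch = input_embeds.size(0)
    prompts = kg_prompts.unsqueeze(0).expand(batch, -1, -1)
    prompt_mask = torch.ones(batch, prompts.size(1),
                             dtype=attention_mask.dtype,
                             device=attention_mask.device)
    return (torch.cat([prompts, input_embeds], dim=1),
            torch.cat([prompt_mask, attention_mask], dim=1))


# Usage sketch: the PLM stays frozen; only the mapper would be trained.
mapper = KnowledgeMapper(kg_dim=100, plm_dim=768)
kg_emb = torch.randn(8, 100)       # embeddings of 8 compressed-subgraph nodes
soft_prompts = mapper(kg_emb)      # (8, 768) soft prompt tokens
input_embeds = torch.randn(2, 16, 768)          # batch of PLM input embeddings
attention_mask = torch.ones(2, 16, dtype=torch.long)
embeds, mask = build_prompted_inputs(input_embeds, attention_mask, soft_prompts)
print(embeds.shape, mask.shape)    # (2, 24, 768), (2, 24)
```

Because the prompts are consumed at the embedding layer, a design like this leaves both the PLM and the downstream head untouched, which is consistent with the decoupling and reduced-retraining claims in the abstract.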
Journal introduction:
Knowledge-Based Systems is an international and interdisciplinary journal in artificial intelligence that publishes original, innovative, and creative research results in the field. It focuses on systems built with knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, to provide balanced coverage of theory and practical study, and to encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.