{"title":"Construction of Question Answering System Based on English Pre-Trained Language Model Enhanced by Knowledge Graph","authors":"Pei Song","doi":"10.1016/j.procs.2025.04.256","DOIUrl":null,"url":null,"abstract":"<div><div>With the continuous development of technology, question answering systems based on pre trained language models have become an important component of intelligent applications. However, traditional question-answering systems often face challenges in terms of understanding accuracy and insufficient knowledge coverage when dealing with open-domain questions. To this end, this paper uses a method for building a question-answering system based on a knowledge graph-enhanced BERT pre-trained language model. First, this paper trains the basic model based on a large-scale BERT pre-trained language model. Then, this paper adopts knowledge graph technology to introduce structured knowledge information into the model by integrating domain-specific knowledge bases. Finally, in order to effectively integrate knowledge graph information, this paper uses graph neural network (GNN) to model graph data, and combines the self-attention mechanism in the BERT model to optimize the weighted fusion process of knowledge graph information. Experimental results show that the BERT model enhanced with knowledge graph performs well in multiple question-answering tasks. On the simple question of the first experiment, the average accuracy of the enhanced model increased from 84.6% of the standard BERT model to 89.8%, and the F1 score increased from 0.86 to 0.91. In complex reasoning tasks, the BERT+knowledge graph model demonstrates stronger reasoning ability and higher knowledge coverage. In the experimental conclusion, the introduction of the knowledge graph significantly improves the model’s reasoning ability and knowledge coverage, especially in professional field problems and multi-step reasoning tasks, the enhanced model shows stronger capabilities. This research provides a new construction method for question-answering systems, demonstrates the great potential of knowledge graphs in natural language processing, and has broad application prospects.</div></div>","PeriodicalId":20465,"journal":{"name":"Procedia Computer Science","volume":"261 ","pages":"Pages 647-655"},"PeriodicalIF":0.0000,"publicationDate":"2025-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Procedia Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1877050925013584","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
With the continuous development of technology, question answering systems based on pre-trained language models have become an important component of intelligent applications. However, traditional question-answering systems often struggle with understanding accuracy and insufficient knowledge coverage when handling open-domain questions. To address this, this paper proposes a method for building a question-answering system based on a knowledge graph-enhanced BERT pre-trained language model. First, the base model is trained on a large-scale BERT pre-trained language model. Then, knowledge graph technology is used to introduce structured knowledge into the model by integrating domain-specific knowledge bases. Finally, to integrate knowledge graph information effectively, a graph neural network (GNN) models the graph data, and this is combined with the self-attention mechanism in BERT to optimize the weighted fusion of knowledge graph information. Experimental results show that the knowledge graph-enhanced BERT model performs well across multiple question-answering tasks. On the simple-question task in the first experiment, average accuracy rose from 84.6% for the standard BERT model to 89.8%, and the F1 score rose from 0.86 to 0.91. On complex reasoning tasks, the BERT + knowledge graph model demonstrates stronger reasoning ability and higher knowledge coverage. In conclusion, introducing the knowledge graph significantly improves the model's reasoning ability and knowledge coverage; the enhanced model is particularly strong on domain-specific problems and multi-step reasoning tasks. This research provides a new construction method for question-answering systems, demonstrates the potential of knowledge graphs in natural language processing, and has broad application prospects.
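To make the fusion step concrete, below is a minimal, illustrative PyTorch sketch of how GNN-encoded knowledge-graph entity embeddings could be fused into BERT token representations via an attention mechanism. This is not the paper's actual implementation: the single-layer GCN, the cross-attention fusion, all dimensions, and the random placeholder graph data are assumptions for demonstration only.

```python
# Hypothetical sketch: fusing GNN-encoded knowledge-graph (KG) entity
# embeddings into BERT token representations. Not the paper's code.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class SimpleGCNLayer(nn.Module):
    """One graph-convolution step: aggregate neighbor features through a
    row-normalized adjacency matrix, then apply a linear projection."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)

    def forward(self, x, adj):
        # x: (num_entities, in_dim); adj: (num_entities, num_entities)
        return torch.relu(self.linear(adj @ x))


class KGEnhancedBert(nn.Module):
    """BERT encoder whose token representations attend over GNN-encoded
    KG entity embeddings (an assumed stand-in for the paper's weighted
    fusion of knowledge graph information)."""
    def __init__(self, kg_dim=128, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)
        hidden = self.bert.config.hidden_size  # 768 for bert-base
        self.gcn = SimpleGCNLayer(kg_dim, hidden)
        # Cross-attention: BERT tokens (queries) attend over KG entities
        # (keys/values), producing a weighted mixture of KG information.
        self.fusion = nn.MultiheadAttention(hidden, num_heads=8,
                                            batch_first=True)

    def forward(self, input_ids, attention_mask, entity_feats, adj):
        tokens = self.bert(input_ids=input_ids,
                           attention_mask=attention_mask).last_hidden_state
        entities = self.gcn(entity_feats, adj)  # (num_entities, hidden)
        entities = entities.unsqueeze(0).expand(tokens.size(0), -1, -1)
        fused, _ = self.fusion(query=tokens, key=entities, value=entities)
        return tokens + fused  # residual fusion keeps the BERT signal


if __name__ == "__main__":
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    batch = tokenizer(["Who discovered penicillin?"], return_tensors="pt")
    num_entities, kg_dim = 50, 128
    entity_feats = torch.randn(num_entities, kg_dim)  # placeholder features
    adj = torch.eye(num_entities)                     # placeholder adjacency
    model = KGEnhancedBert(kg_dim=kg_dim)
    out = model(batch["input_ids"], batch["attention_mask"],
                entity_feats, adj)
    print(out.shape)  # (1, seq_len, 768)
```

The residual connection leaves the original BERT representation intact, so the KG signal acts as an additive refinement; the attention weights play the role of the weighted fusion described in the abstract.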