{"title":"基于知识图和大语言模型的汉语潜在歧义消歧方法","authors":"Dan Zhang , Delong Jia","doi":"10.1016/j.aej.2025.04.089","DOIUrl":null,"url":null,"abstract":"<div><div>Traditional disambiguation methods struggle to effectively balance and integrate a wide range of contextual information and world knowledge when dealing with potential ambiguities in Chinese. To address this issue, this paper proposes a disambiguation model that integrates knowledge graphs and large language models (LLMs) to tackle lexical ambiguity in Chinese texts. This article uses an attention based disambiguation model, which is fine-tuned using multiple hyperparameter configurations. It optimizes network layers and knowledge graph embedding dimensions to enhance performance. Visualization of the attention mechanism reveals the model's focus on target words, context, and knowledge graph entities. Experiments conducted on a dataset comprising 200,000 sentences demonstrate significant improvements in accuracy and F1 scores, reaching 92.4 % and 91.9 %, respectively, compared to traditional statistical and deep learning models. Visualization of the attention mechanism reveals the model's focus on target words, context, and knowledge graph entities. The findings suggest that integrating knowledge graphs with LLMs offers an innovative approach to complex language tasks. 
In practical applications such as machine translation and chatbots, this model is expected to enhance both performance and interpretability.</div></div>","PeriodicalId":7484,"journal":{"name":"alexandria engineering journal","volume":"126 ","pages":"Pages 293-302"},"PeriodicalIF":6.2000,"publicationDate":"2025-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A disambiguation method for potential ambiguities in Chinese based on knowledge graphs and large language model\",\"authors\":\"Dan Zhang , Delong Jia\",\"doi\":\"10.1016/j.aej.2025.04.089\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Traditional disambiguation methods struggle to effectively balance and integrate a wide range of contextual information and world knowledge when dealing with potential ambiguities in Chinese. To address this issue, this paper proposes a disambiguation model that integrates knowledge graphs and large language models (LLMs) to tackle lexical ambiguity in Chinese texts. This article uses an attention based disambiguation model, which is fine-tuned using multiple hyperparameter configurations. It optimizes network layers and knowledge graph embedding dimensions to enhance performance. Visualization of the attention mechanism reveals the model's focus on target words, context, and knowledge graph entities. Experiments conducted on a dataset comprising 200,000 sentences demonstrate significant improvements in accuracy and F1 scores, reaching 92.4 % and 91.9 %, respectively, compared to traditional statistical and deep learning models. Visualization of the attention mechanism reveals the model's focus on target words, context, and knowledge graph entities. The findings suggest that integrating knowledge graphs with LLMs offers an innovative approach to complex language tasks. 
In practical applications such as machine translation and chatbots, this model is expected to enhance both performance and interpretability.</div></div>\",\"PeriodicalId\":7484,\"journal\":{\"name\":\"alexandria engineering journal\",\"volume\":\"126 \",\"pages\":\"Pages 293-302\"},\"PeriodicalIF\":6.2000,\"publicationDate\":\"2025-05-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"alexandria engineering journal\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1110016825005861\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"alexandria engineering journal","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1110016825005861","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
A disambiguation method for potential ambiguities in Chinese based on knowledge graphs and large language model
Traditional disambiguation methods struggle to effectively balance and integrate the wide range of contextual information and world knowledge needed to resolve potential ambiguities in Chinese. To address this issue, this paper proposes a disambiguation model that integrates knowledge graphs and large language models (LLMs) to tackle lexical ambiguity in Chinese texts. The model is attention-based and is fine-tuned across multiple hyperparameter configurations, optimizing the number of network layers and the knowledge-graph embedding dimensions to enhance performance. Experiments on a dataset of 200,000 sentences demonstrate significant improvements over traditional statistical and deep-learning models, with accuracy and F1 scores reaching 92.4% and 91.9%, respectively. Visualization of the attention mechanism reveals the model's focus on target words, context, and knowledge-graph entities. The findings suggest that integrating knowledge graphs with LLMs offers an innovative approach to complex language tasks. In practical applications such as machine translation and chatbots, the model is expected to enhance both performance and interpretability.
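The abstract does not give implementation details, but the core idea it describes can be sketched roughly: attend from the target word over its context, then score each candidate sense by how well that sense's knowledge-graph entity embedding matches the attention-weighted context. The following minimal sketch is purely illustrative; all function names, the scoring scheme, and the use of dot-product attention are assumptions, not taken from the paper.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def score_senses(target_vec, context_vecs, sense_entity_vecs):
    """Score candidate senses of an ambiguous target word.

    target_vec:        (d,)   embedding of the target word
    context_vecs:      (n, d) embeddings of surrounding context tokens
    sense_entity_vecs: (k, d) knowledge-graph entity embeddings,
                              one per candidate sense
    Returns a (k,) probability distribution over the candidate senses.
    """
    # Attention of the target word over its context tokens.
    attn = softmax(context_vecs @ target_vec)            # (n,)
    context_summary = attn @ context_vecs                # (d,)
    # Each sense is scored by the match between its KG entity embedding
    # and the attention-weighted context representation.
    return softmax(sense_entity_vecs @ context_summary)  # (k,)

rng = np.random.default_rng(0)
d, n, k = 8, 5, 3  # toy dimensions: embedding size, context length, sense count
probs = score_senses(rng.normal(size=d),
                     rng.normal(size=(n, d)),
                     rng.normal(size=(k, d)))
print(probs.shape, round(float(probs.sum()), 6))  # → (3,) 1.0
```

In the paper's full model these embeddings would come from the fine-tuned LLM and the knowledge-graph embedding layer rather than random vectors, and the attention weights themselves are what the authors visualize to interpret the model's focus.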
Journal introduction:
The Alexandria Engineering Journal is an international journal devoted to publishing high-quality papers in the fields of engineering and applied science. It is cited in Engineering Information Services (EIS) and Chemical Abstracts (CA). Papers published in the Alexandria Engineering Journal are grouped into five sections, according to the following classification:
• Mechanical, Production, Marine and Textile Engineering
• Electrical Engineering, Computer Science and Nuclear Engineering
• Civil and Architecture Engineering
• Chemical Engineering and Applied Sciences
• Environmental Engineering