{"title":"Linear self-attention with multi-relational graph for knowledge graph completion","authors":"Weida Liu, Baohua Qiang, Ruidong Chen, Yuan Xie, Lirui Chen, Zhiqin Chen","doi":"10.1007/s10489-025-06592-1","DOIUrl":null,"url":null,"abstract":"<div><p>Knowledge graph completion (KGC) aims to infer missing facts based on the existing knowledge. Graph Convolutional Networks (GCNs) have gained significant traction due to their proficiency in effectively modeling graph structures, especially within the realm of Knowledge Graph Completion (KGC). In GCN-based KGC methodologies, GCNs are initially employed to generate comprehensive representations of entities, followed by the application of Knowledge Graph Embedding (KGE) models to elucidate the interactions among entities and relations. However, most GCN-based KGC models ignore the long-range pairwise relationships in the graph. To address these limitations and enhance KGC, we propose a model called Linear Self-Attention with Multi-Relational Graph Network (LTRGN). Specifically, this model merges GCN and linear self-attention to serve as the encoder. This model introduces a linear self-attention that can capture long-range node dependencies without introducing excessive computational overhead. Furthermore, we implement an attention mechanism designed to better assess the significance of various neighboring nodes relative to the source node. We demonstrate the effectiveness of the proposed LTRGN on the standard FB15k-237, WN18RR, Kinship, and UMLS datasets. On the dense graphs Kinship and UMLS, the MRR of our model improves by 1.3% and 4.1%, respectively, while Hits@1 increases by 1.7% and 6.4% compared to the best-performing model. The results show the efficacy of the model for the KGC task. The code is released at https://github.com/lixianqingliuyan/LTRGN.</p></div>","PeriodicalId":8041,"journal":{"name":"Applied Intelligence","volume":"55 10","pages":""},"PeriodicalIF":3.4000,"publicationDate":"2025-05-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Intelligence","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10489-025-06592-1","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Knowledge graph completion (KGC) aims to infer missing facts from existing knowledge. Graph Convolutional Networks (GCNs) have gained significant traction for KGC because of their effectiveness in modeling graph structure. In GCN-based KGC methods, a GCN is first employed to generate rich entity representations, and a Knowledge Graph Embedding (KGE) model then scores the interactions among entities and relations. However, most GCN-based KGC models ignore long-range pairwise relationships in the graph. To address this limitation, we propose the Linear Self-Attention with Multi-Relational Graph Network (LTRGN). Specifically, LTRGN combines a GCN with linear self-attention to form its encoder; the linear self-attention captures long-range node dependencies without introducing excessive computational overhead. We further implement an attention mechanism that better assesses the significance of each neighboring node relative to the source node. We demonstrate the effectiveness of LTRGN on the standard FB15k-237, WN18RR, Kinship, and UMLS datasets. On the dense graphs Kinship and UMLS, our model improves MRR by 1.3% and 4.1% and Hits@1 by 1.7% and 6.4%, respectively, over the best-performing baseline. These results confirm the efficacy of the model for the KGC task. The code is released at https://github.com/lixianqingliuyan/LTRGN.
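For readers unfamiliar with the linear self-attention the abstract refers to, the sketch below shows one standard way to obtain linear complexity: replace the softmax kernel with a positive feature map phi(x) = elu(x) + 1 and reorder the matrix products so that keys and values are aggregated once. This is a minimal PyTorch sketch under that assumption; the class name LinearSelfAttention, the projection sizes, and the choice of feature map are illustrative and are not taken from the paper's actual implementation.

import torch
import torch.nn as nn
import torch.nn.functional as F

class LinearSelfAttention(nn.Module):
    # Kernelized linear self-attention: every node attends to every
    # other node, but the cost is O(N * d^2) rather than the O(N^2 * d)
    # of softmax attention, because keys and values are summarized once.
    def __init__(self, dim: int, eps: float = 1e-6):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (N, d) node embeddings, e.g. the output of a GCN layer.
        phi = lambda t: F.elu(t) + 1.0        # positive feature map
        q = phi(self.q_proj(x))               # (N, d)
        k = phi(self.k_proj(x))               # (N, d)
        v = self.v_proj(x)                    # (N, d)
        kv = k.t() @ v                        # (d, d) shared key-value summary
        k_sum = k.sum(dim=0)                  # (d,)  shared normalizer
        z = 1.0 / (q @ k_sum + self.eps)      # (N,)  per-node normalization
        return (q @ kv) * z.unsqueeze(-1)     # (N, d) attended embeddings

# Example: 1,000 entity embeddings of width 64.
# attn = LinearSelfAttention(64)
# out = attn(torch.randn(1000, 64))          # shape (1000, 64)

Because the (d, d) summary kv is shared by all queries, long-range pairwise dependencies are captured without ever materializing an N-by-N attention matrix, which is what keeps the overhead modest on large graphs.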
Journal description:
With a focus on research in artificial intelligence and neural networks, this journal addresses real-life manufacturing, defense, management, government, and industrial problems that are too complex to be solved through conventional approaches and instead require the simulation of intelligent thought processes, heuristics, the application of knowledge, and distributed and parallel processing. The integration of these multiple approaches in solving complex problems is of particular importance.
The journal presents new and original research and technological developments, addressing real and complex issues applicable to difficult problems. It provides a medium for exchanging scientific research and technological achievements accomplished by the international community.