Authors: Zhiwen Xie, Runjie Zhu, Meng Zhang, Jin Liu
DOI: 10.1111/coin.70097 (https://onlinelibrary.wiley.com/doi/10.1111/coin.70097)
Journal: Computational Intelligence, vol. 41, issue 4 (JCR Q3, Computer Science, Artificial Intelligence; Impact Factor 1.8)
Published: 2025-07-09 (Journal Article)
SparseMult: A Sparse Tensor Decomposition Model for Knowledge Graph Link Prediction
Knowledge graphs (KGs) have shown great power in many downstream natural language processing (NLP) tasks, such as recommendation systems and question answering. Despite the large number of knowledge facts they contain, KGs suffer from incompleteness: many relations between entities are missing. Link prediction, also known as knowledge graph completion (KGC), aims to predict these missing relations. Models based on tensor decomposition, such as Rescal and DistMult, are promising approaches to link prediction. However, Rescal cannot scale to large KGs because of its large number of parameters, and DistMult, which simplifies Rescal by representing relations with diagonal matrices, cannot model antisymmetric relations. To address these problems, we propose SparseMult, a novel tensor decomposition model based on sparse relation matrices. Specifically, we view a KG as a 3D tensor and decompose it into entity vectors and relation matrices. To reduce the number of parameters, we represent each relation matrix as a sparse block-diagonal matrix, so that the number of relation parameters grows linearly with the embedding size and the model can scale to large KGs. Moreover, we analyze the model's ability to capture different relation patterns and show that SparseMult can model symmetric, antisymmetric, and inverse relations. We conduct extensive experiments on three widely used benchmark datasets: FB15k-237, WN18RR, and CCKS2021. Experimental results demonstrate that SparseMult outperforms most state-of-the-art methods.
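The abstract does not give the exact parameterization, but the core idea it describes — a bilinear score h^T R t in which the relation matrix R is block diagonal, so its parameter count grows linearly with the embedding size — can be sketched as follows. This is a minimal illustration with hypothetical names, not the authors' implementation; with 1×1 blocks the score reduces to DistMult's diagonal form, while blocks of size 2 or more admit non-symmetric blocks and hence antisymmetric relations.

```python
import numpy as np

def sparse_block_score(h, t, rel_blocks):
    """Bilinear score h^T R t with a block-diagonal relation matrix R.

    h, t       : entity embeddings of shape (d,)
    rel_blocks : array of shape (k, b, b) holding the k diagonal blocks,
                 with d = k * b, so relation parameters are d * b,
                 linear in the embedding size d for a fixed block size b
                 (versus d * d for a full Rescal-style matrix).
    """
    k, b, _ = rel_blocks.shape
    h_blocks = h.reshape(k, b)  # split h and t into k chunks of size b
    t_blocks = t.reshape(k, b)
    # sum_i  h_i^T B_i t_i  over the k blocks
    return np.einsum("kb,kbc,kc->", h_blocks, rel_blocks, t_blocks)

# Example: d = 6 with 2x2 blocks.
rng = np.random.default_rng(0)
d, b = 6, 2
h = rng.standard_normal(d)
t = rng.standard_normal(d)
blocks = rng.standard_normal((d // b, b, b))
score = sparse_block_score(h, t, blocks)
```

Note that if each block B_i is chosen antisymmetric (B_i^T = -B_i), then swapping h and t flips the sign of the score, which is how a block-diagonal bilinear form can represent antisymmetric relations that a purely diagonal model cannot.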
Journal introduction:
This leading international journal promotes and stimulates research in the field of artificial intelligence (AI). Covering a wide range of issues - from the tools and languages of AI to its philosophical implications - Computational Intelligence provides a vigorous forum for the publication of both experimental and theoretical research, as well as surveys and impact studies. The journal is designed to meet the needs of a wide range of AI workers in academic and industrial research.