Zhengyun Zhou, Guojia Wan, Shirui Pan, Jia Wu, Wenbin Hu, Bo Du
World Wide Web (journal article), DOI: 10.1007/s11280-024-01254-7, published 2024-04-11
Complex query answering over knowledge graphs foundation model using region embeddings on a Lie group
Answering complex queries with first-order logical operators over knowledge graphs, such as conjunction (\(\wedge\)), disjunction (\(\vee\)), and negation (\(\lnot\)), is immensely useful for identifying missing knowledge. Recently, neural symbolic reasoning methods have been proposed that map entities and relations into a continuous real vector space and model logical operators as differentiable neural networks. However, traditional methods employ negative sampling, which corrupts complex queries to train embeddings. Consequently, these embeddings are susceptible to divergence in the open manifold \(\mathbb{R}^n\), and appropriate regularization is crucial for addressing this divergence. In this paper, we introduce a Lie group as a compact embedding space for complex query embedding, enhancing the foundation model's ability to handle the intricacies of knowledge graphs. Our method aims to answer conjunctive and disjunctive queries. Entities and queries are represented as regions of a high-dimensional torus, on which projection, intersection, union, and negation naturally simulate the corresponding logical operations. After applying these operations to the torus regions we defined, the resulting geometry remains unchanged, i.e., the region representation is closed under the operators. Extensive experiments on FB15K, FB15K-237, and NELL995 show that our approach achieves significant improvements, leveraging the strengths of the knowledge graph foundation model and complex query processing.
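The abstract does not spell out the operator definitions, but the closure claim can be illustrated with a minimal sketch: regions as arcs on a 1-D torus (the circle), each given by an angular center and half-width. Everything below is an illustrative assumption, not the paper's actual high-dimensional construction:

```python
import numpy as np

# Illustrative sketch only: a region is an arc on the 1-D torus S^1,
# parameterized by an angular center `c` and a half-width `w` in (0, pi).
# These formulas are assumptions chosen to show closure under the operators,
# not the paper's construction.

def circ_diff(a, b):
    """Signed angular difference a - b, wrapped into (-pi, pi]."""
    return (a - b + np.pi) % (2 * np.pi) - np.pi

def negate(c, w):
    """Complement of an arc is again an arc: antipodal center, half-width pi - w."""
    return (c + np.pi) % (2 * np.pi), np.pi - w

def intersect(c1, w1, c2, w2):
    """Intersection of two arcs, assumed to overlap in at most a single arc.
    Work in a frame centered on the first arc, intersect as intervals, map back."""
    d = circ_diff(c2, c1)          # second center relative to the first
    lo = max(-w1, d - w2)
    hi = min(w1, d + w2)
    if lo > hi:                    # arcs do not overlap
        return None
    return (c1 + (lo + hi) / 2) % (2 * np.pi), (hi - lo) / 2

# Every operation returns another (center, half-width) arc, so the geometry
# of the representation is unchanged, mirroring the closure claim above.
```

For example, intersecting arcs centered at 0 and 0.5 (each with half-width 1.0) yields the arc centered at 0.25 with half-width 0.75; union and projection admit similar closed forms, which is what keeps region embeddings well-behaved on a compact space.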