Attention-Based Complex Logical Query on Temporal Knowledge Graph via Graph Neural Network
Authors: Luyi Bai; Linshuo Xu; Lin Zhu
Journal: IEEE Transactions on Big Data, vol. 11, no. 4, pp. 1828-1839
DOI: 10.1109/TBDATA.2024.3489421
Published: 2024-10-31
URL: https://ieeexplore.ieee.org/document/10740050/
Citations: 0
Abstract
Answering complex logical queries on large-scale Knowledge Graphs (KGs) efficiently and accurately has always been crucial for question-answering systems. Recent studies have significantly improved the performance of complex logical queries on massive knowledge graphs by leveraging graph neural networks (GNNs). However, existing GNN-based methods still have limitations when handling long-sequence logical queries. They usually decompose a complex query into multiple independent first-order logical queries, which prevents global optimization, so query accuracy drops sharply as query length increases. In addition, knowledge in the real world changes dynamically, but most existing methods are better suited to static knowledge graphs, leaving much room for improvement on logical queries over temporal knowledge graphs. In this paper, we propose a novel Temporal Complex Logical Query (TCLQ) model for answering temporal logical queries on temporal knowledge graphs. We add a time-series embedding to the GNN and use multi-layer GRUs to aggregate node features from the previous and current timestamps, which effectively enhances the model's temporal reasoning ability. To address the sharp accuracy decline as query sequence length grows, we establish a multi-level attention-coefficient model that learns and optimizes the whole logical query, thereby reducing the error accumulation caused by decomposing queries into multiple independent first-order logical queries. We conduct experiments on multiple temporal datasets and demonstrate the effectiveness of TCLQ.
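The abstract describes two mechanisms: GRU cells that fuse a node's previous hidden state with its current features across timestamps, and attention coefficients that weight the steps of a query as a whole. The paper's actual architecture is not given here, so the following is only a minimal NumPy sketch of those two generic building blocks; all function and parameter names (`GRUCell`, `aggregate_over_time`, `attention_pool`, the dimensions) are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

class GRUCell:
    """Single GRU cell: fuses the previous hidden state with the
    current node features (one layer of the temporal aggregation)."""
    def __init__(self, in_dim, hid_dim, seed=0):
        rng = np.random.default_rng(seed)
        scale = 1.0 / np.sqrt(hid_dim)
        # update gate (z), reset gate (r), and candidate-state weights
        self.Wz = rng.uniform(-scale, scale, (in_dim + hid_dim, hid_dim))
        self.Wr = rng.uniform(-scale, scale, (in_dim + hid_dim, hid_dim))
        self.Wh = rng.uniform(-scale, scale, (in_dim + hid_dim, hid_dim))

    def step(self, x, h_prev):
        xh = np.concatenate([x, h_prev], axis=-1)
        z = sigmoid(xh @ self.Wz)               # how much new information to admit
        r = sigmoid(xh @ self.Wr)               # how much of the past to reset
        xh_r = np.concatenate([x, r * h_prev], axis=-1)
        h_cand = np.tanh(xh_r @ self.Wh)
        return (1.0 - z) * h_prev + z * h_cand  # blend past state and candidate

def aggregate_over_time(node_feats, cell):
    """node_feats: (T, N, D) features of N nodes at T timestamps.
    Returns the per-timestamp hidden states, shape (T, N, H)."""
    h = np.zeros((node_feats.shape[1], cell.Wz.shape[1]))
    outs = []
    for x_t in node_feats:          # roll the GRU forward through time
        h = cell.step(x_t, h)
        outs.append(h)
    return np.stack(outs)

def attention_pool(step_states, query):
    """Softmax attention over one node's T per-step states — a stand-in
    for learning coefficients over the whole query rather than scoring
    each decomposed sub-query independently."""
    scores = step_states @ query                # (T,) unnormalized weights
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    return weights @ step_states                # attention-weighted summary, (H,)
```

Under these assumptions, `aggregate_over_time` gives each node a history-aware state per timestamp, and `attention_pool` combines those states with learned-style weights instead of treating each step in isolation, which is the intuition behind reducing error accumulation across long query sequences.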
Journal Introduction:
The IEEE Transactions on Big Data publishes peer-reviewed articles focusing on big data. These articles present innovative research ideas and application results across disciplines, including novel theories, algorithms, and applications. Research areas cover a wide range, such as big data analytics, visualization, curation, management, semantics, infrastructure, standards, performance analysis, intelligence extraction, scientific discovery, security, privacy, and legal issues specific to big data. The journal also prioritizes applications of big data in fields generating massive datasets.