Title: A hierarchical and interlamination graph self-attention mechanism-based knowledge graph reasoning architecture
Journal: Information Sciences (Computer Science, Information Systems)
DOI: 10.1016/j.ins.2024.121345
Published: 2024-08-21 (Journal Article)
URL: https://www.sciencedirect.com/science/article/pii/S0020025524012593
Citations: 0
Abstract
Knowledge Graphs (KGs) are an essential research topic in graph theory, but their inherent incompleteness and sparsity limit their usefulness in many applications. Knowledge Graph Reasoning (KGR) aims to mitigate these problems by mining new knowledge from existing knowledge. As one of the downstream tasks of KGR, link prediction is of great significance for improving the quality of KGs. Recently, Graph Neural Network (GNN)-based methods have become the most effective way to perform link prediction. However, they still suffer from problems such as incomplete neighbor- and relation-level information aggregation and unstable learning of entity features. To address these issues, this paper proposes a Hierarchical and Interlamination Graph Self-attention Mechanism-based (HIGSM) plug-and-play architecture for KGR. It consists of three layers: a feature extractor, an encoder, and a decoder. The feature extractor makes the architecture more effective and stable at deriving new features. The encoder employs a two-stage encoding mechanism combined with two mixture-of-experts strategies, enabling the architecture to capture more useful reasoning information and thereby improve the model's prediction accuracy and generalization. The decoder can use existing KGR models to compute the scores of triples in the KG. Extensive experiments and ablation studies on four KGs demonstrate the state-of-the-art prediction performance of the proposed HIGSM architecture compared with current GNN-based methods.
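The pipeline the abstract describes (neighbor aggregation via graph self-attention, a mixture-of-experts encoding stage, and a pluggable triple-scoring decoder) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: the masked attention, the softmax gating network, and the DistMult-style decoder are illustrative stand-ins, and the names `attention_encode`, `moe_encode`, and `distmult_score` are assumptions introduced here for clarity.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    # numerically stable softmax
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention_encode(H, adj):
    """Masked graph self-attention: each entity attends only to its
    neighbours (and itself) as given by the adjacency matrix."""
    d = H.shape[1]
    scores = (H @ H.T) / np.sqrt(d)
    scores = np.where(adj > 0, scores, -1e9)  # mask out non-neighbours
    return softmax(scores, axis=1) @ H

def moe_encode(H, expert_ws, gate_w):
    """Mixture-of-experts stage: a softmax gate mixes per-expert
    linear transforms of each entity embedding."""
    gates = softmax(H @ gate_w, axis=1)              # (n, k) gate weights
    outs = np.stack([H @ W for W in expert_ws])      # (k, n, d) expert outputs
    return np.einsum("nk,knd->nd", gates, outs)      # gated combination

def distmult_score(h, r, t):
    """DistMult-style decoder used as a stand-in: score = <h, r, t>."""
    return float(np.sum(h * r * t))

# Toy KG: 4 entities, embedding dim 8, symmetric adjacency with self-loops.
n, d, k = 4, 8, 2
H = rng.normal(size=(n, d))
adj = np.eye(n) + np.array([[0, 1, 0, 0],
                            [1, 0, 1, 0],
                            [0, 1, 0, 1],
                            [0, 0, 1, 0]])
experts = [rng.normal(size=(d, d)) for _ in range(k)]
gate = rng.normal(size=(d, k))

H1 = attention_encode(H, adj)       # stage 1: neighbour-level aggregation
H2 = moe_encode(H1, experts, gate)  # stage 2: expert-mixture encoding
r = rng.normal(size=d)
score = distmult_score(H2[0], r, H2[2])  # score a candidate triple (e0, r, e2)
```

Because the decoder is a separate function of the final embeddings, any existing KGR scoring model (TransE, ComplEx, etc.) could be swapped in at that stage, which is the sense in which such an architecture is plug-and-play.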
Journal overview:
Information Sciences (Informatics and Computer Science, Intelligent Systems, Applications) is an international journal that publishes original and creative research findings in the field of information sciences. We also feature a limited number of timely tutorial and survey contributions.
Our journal aims to cater to a diverse audience, including researchers, developers, managers, strategic planners, graduate students, and anyone interested in staying up-to-date with cutting-edge research in information science, knowledge engineering, and intelligent systems. While readers are expected to share a common interest in information science, they come from varying backgrounds such as engineering, mathematics, statistics, physics, computer science, cell biology, molecular biology, management science, cognitive science, neurobiology, behavioral sciences, and biochemistry.