Title: Representation learning of knowledge graphs using convolutional neural networks
Authors: Wang Gao, Y. Fang, F. Zhang, Z. Yang
Journal: Neural Network World, vol. 30, no. 1, pp. 145-160, 2020 (JCR Q4, Computer Science, Artificial Intelligence)
DOI: https://doi.org/10.14311/nnw.2020.30.011
Citations: 8
Abstract
Knowledge graphs play an important role in many Artificial Intelligence (AI) applications such as entity linking and question answering. However, most previous studies have focused on symbolic representations of knowledge graphs built from structural information alone, which cannot handle new entities or rare entities with little relevant knowledge. In this paper, we propose a new deep knowledge representation architecture that jointly encodes both structural and textual information. We first propose a novel neural model based on Convolutional Neural Networks (CNN) to encode the text descriptions of entities. Second, an attention mechanism is applied to capture the valuable information in these descriptions. We then introduce position vectors as supplementary information. Finally, a gate mechanism is designed to integrate the structure and text representations into a joint representation. Experimental results on two datasets show that our models obtain state-of-the-art results on the link prediction and triplet classification tasks, and achieve the best performance on the relation classification task.
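The pipeline the abstract outlines — a CNN-plus-attention encoder that pools an entity's text description into a vector, followed by a sigmoid gate that mixes it with the structural embedding — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dimensions, window size of 1, and all weight names are assumptions for demonstration.

```python
import numpy as np

# Hedged sketch of the joint-representation idea described above:
# (1) a 1-D CNN with attention pooling encodes a description,
# (2) a gate mechanism mixes text and structure embeddings element-wise.
# All parameters are random placeholders, not learned weights.

rng = np.random.default_rng(0)
d = 8            # embedding dimension (assumed)
seq_len = 5      # number of word vectors in the description (assumed)

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

def encode_description(word_vecs, conv_w, attn_q):
    """CNN feature maps (window size 1 for brevity) + attention pooling."""
    feats = np.tanh(word_vecs @ conv_w)   # (seq_len, d) convolution features
    weights = softmax(feats @ attn_q)     # attention weights over positions
    return weights @ feats                # weighted sum -> (d,) text embedding

def gated_join(e_struct, e_text, gate_w, gate_b):
    """Gate mechanism: convex element-wise mix of structure and text."""
    z = gate_w @ np.concatenate([e_struct, e_text]) + gate_b
    g = 1.0 / (1.0 + np.exp(-z))          # sigmoid gate in (0, 1)
    return g * e_struct + (1.0 - g) * e_text

word_vecs = rng.normal(size=(seq_len, d))        # stand-in word embeddings
e_struct = rng.normal(size=d)                    # structural embedding
e_text = encode_description(word_vecs,
                            rng.normal(size=(d, d)),
                            rng.normal(size=d))
e_joint = gated_join(e_struct, e_text,
                     rng.normal(size=(d, 2 * d)),
                     rng.normal(size=d))
print(e_joint.shape)  # (8,)
```

Because the gate output lies in (0, 1), each coordinate of the joint embedding is a convex combination of the corresponding structure and text coordinates, which is what lets the model fall back on text for rare entities with weak structural signals.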
About the journal:
Neural Network World is a bimonthly journal providing the latest developments in the field of informatics with attention mainly devoted to the problems of:
brain science,
theory and applications of neural networks (both artificial and natural),
fuzzy-neural systems,
methods and applications of evolutionary algorithms,
methods and applications of parallel and massively parallel computing,
problems of soft-computing,
methods of artificial intelligence.