Knowledge Graph Embedding Using a Multi-Channel Interactive Convolutional Neural Network with Triple Attention

Lin Shi, Weitao Liu, Yafeng Wu, Chenxu Dai, Zhanlin Ji, Ivan Ganchev
DOI: 10.3390/math12182821
Published: 2024-09-11 (Journal Article)

Abstract

Knowledge graph embedding (KGE) has been identified as an effective method for link prediction, which involves predicting missing relations or entities based on existing entities or relations. KGE is an important method for implementing knowledge representation and, as such, has been widely used in driving intelligent applications w.r.t. question-answering systems, recommendation systems, and relationship extraction. Models based on convolutional neural networks (CNNs) have achieved good results in link prediction. However, as the coverage areas of knowledge graphs expand, the increasing volume of information significantly limits the performance of these models. This article introduces a triple-attention-based multi-channel CNN model, named ConvAMC, for the KGE task. In the embedding representation module, entities and relations are embedded into a complex space and the embeddings are performed in an alternating pattern. This approach helps in capturing richer semantic information and enhances the expressive power of the model. In the encoding module, a multi-channel approach is employed to extract more comprehensive interaction features. A triple attention mechanism and max pooling layers are used to ensure that interactions between spatial dimensions and output tensors are captured during the subsequent tensor concatenation and reshaping process, which allows preserving local and detailed information. Finally, feature vectors are transformed into prediction targets for embedding through the Hadamard product of feature mapping and reshaping matrices. Extensive experiments were conducted to evaluate the performance of ConvAMC on three benchmark datasets compared with state-of-the-art (SOTA) models, demonstrating that the proposed model outperforms all compared models across all evaluation metrics on two of the datasets, and achieves advanced link prediction results on most evaluation metrics on the third dataset.
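The pipeline the abstract describes can be illustrated with a minimal NumPy sketch: head-entity and relation embeddings live in a complex space, their real and imaginary parts are interleaved in an alternating pattern, the result is encoded (here a simple nonlinearity stands in for the paper's multi-channel convolution, triple attention, and max pooling), and the encoded features are scored against the tail embedding via a Hadamard (element-wise) product. All dimensions, names, and the toy encoder are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical toy sizes (illustrative only, not from the paper)
n_entities, n_relations, d = 5, 3, 8  # d = dim per complex component

# Complex-space embeddings: separate real and imaginary parts
E_re = rng.normal(size=(n_entities, d))
E_im = rng.normal(size=(n_entities, d))
R_re = rng.normal(size=(n_relations, d))
R_im = rng.normal(size=(n_relations, d))

# Fixed projection standing in for the learned feature-mapping matrix
W = rng.standard_normal((4 * d, 2 * d))

def interleave(re, im):
    """Alternate real and imaginary components into one vector,
    mimicking the 'alternating pattern' the abstract describes."""
    out = np.empty(re.shape[0] * 2)
    out[0::2] = re
    out[1::2] = im
    return out

def score(h, r, t):
    """Score a triple (h, r, t): stack interleaved head and relation
    embeddings into a 2-D feature map, encode it (tanh stands in for
    the conv + attention encoder), project back to embedding size,
    then take the Hadamard product with the tail embedding and sum."""
    x = np.stack([interleave(E_re[h], E_im[h]),
                  interleave(R_re[r], R_im[r])])  # 2 x 2d feature map
    feat = np.tanh(x).reshape(-1)                 # toy encoder + flatten
    proj = feat @ W                               # feature mapping
    tail = interleave(E_re[t], E_im[t])
    return float(np.sum(proj * tail))             # Hadamard product, then sum
```

In a trained model, higher scores would indicate more plausible triples; here the weights are random, so the sketch only shows the shape of the computation, not useful predictions.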