Semantic Interaction Matching Network for Few-shot Knowledge Graph Completion

IF 2.6 · CAS Tier 4 (Computer Science) · JCR Q2, Computer Science, Information Systems
Pengfei Luo, Xi Zhu, Tong Xu, Yi Zheng, Enhong Chen
{"title":"基于语义交互匹配网络的少镜头知识图补全","authors":"Pengfei Luo, Xi Zhu, Tong Xu, Yi Zheng, Enhong Chen","doi":"https://dl.acm.org/doi/10.1145/3589557","DOIUrl":null,"url":null,"abstract":"<p>The prosperity of knowledge graphs (KG), as well as related downstream applications, have raised the urgent request of knowledge graph completion techniques for fully supporting the KG reasoning tasks, especially under the circumstance of training data scarcity. Though large efforts have been made on solving this challenge via few-shot learning tools, they mainly focus on simply aggregating entity neighbors to represent few-shot references, while the enhancement from latent semantic correlation within neighbors has been largely ignored. To that end, in this paper, we propose a novel few-shot learning solution, named as Semantic Interaction Matching network (SIM), which applies Transformer framework to enhance the entity representation with capturing semantic interaction between entity neighbors. Specifically, we first design entity-relation fusion module to adaptively encode neighbors with incorporating relation representation. Along this line, Transformer layers are integrated to capture latent correlation within neighbors, as well as the semantic diversification of the support set. Finally, a similarity score is attentively estimated with attention mechanism. Extensive experiments on two public benchmark datasets demonstrate that our model outperforms a variety of state-of-the-art methods with a significant margin.</p>","PeriodicalId":50940,"journal":{"name":"ACM Transactions on the Web","volume":"43 27","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2023-03-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Semantic Interaction Matching Network for Few-shot Knowledge Graph Completion\",\"authors\":\"Pengfei Luo, Xi Zhu, Tong Xu, Yi Zheng, Enhong Chen\",\"doi\":\"https://dl.acm.org/doi/10.1145/3589557\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>The prosperity of knowledge graphs (KG), as well as related downstream applications, have raised the urgent request of knowledge graph completion techniques for fully supporting the KG reasoning tasks, especially under the circumstance of training data scarcity. Though large efforts have been made on solving this challenge via few-shot learning tools, they mainly focus on simply aggregating entity neighbors to represent few-shot references, while the enhancement from latent semantic correlation within neighbors has been largely ignored. To that end, in this paper, we propose a novel few-shot learning solution, named as Semantic Interaction Matching network (SIM), which applies Transformer framework to enhance the entity representation with capturing semantic interaction between entity neighbors. Specifically, we first design entity-relation fusion module to adaptively encode neighbors with incorporating relation representation. Along this line, Transformer layers are integrated to capture latent correlation within neighbors, as well as the semantic diversification of the support set. Finally, a similarity score is attentively estimated with attention mechanism. 
Extensive experiments on two public benchmark datasets demonstrate that our model outperforms a variety of state-of-the-art methods with a significant margin.</p>\",\"PeriodicalId\":50940,\"journal\":{\"name\":\"ACM Transactions on the Web\",\"volume\":\"43 27\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2023-03-30\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on the Web\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/https://dl.acm.org/doi/10.1145/3589557\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on the Web","FirstCategoryId":"94","ListUrlMain":"https://doi.org/https://dl.acm.org/doi/10.1145/3589557","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Citations: 0

Abstract


The prosperity of knowledge graphs (KGs), as well as related downstream applications, has raised an urgent demand for knowledge graph completion techniques that fully support KG reasoning tasks, especially when training data are scarce. Though large efforts have been made to address this challenge with few-shot learning tools, they mainly focus on simply aggregating entity neighbors to represent few-shot references, while the enhancement from latent semantic correlation within neighbors has been largely ignored. To that end, in this paper we propose a novel few-shot learning solution, named Semantic Interaction Matching network (SIM), which applies the Transformer framework to enhance entity representations by capturing semantic interaction between entity neighbors. Specifically, we first design an entity-relation fusion module to adaptively encode neighbors by incorporating relation representations. Along this line, Transformer layers are integrated to capture latent correlation within neighbors, as well as the semantic diversification of the support set. Finally, a similarity score is estimated with an attention mechanism. Extensive experiments on two public benchmark datasets demonstrate that our model outperforms a variety of state-of-the-art methods by a significant margin.
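
To make the pipeline described in the abstract more concrete, the sketch below mirrors its three stages: an entity-relation fusion step over neighbor (relation, entity) pairs, Transformer layers that model interaction among the fused neighbors, and an attention-weighted similarity between a query pair and the few-shot support (reference) pairs. It is a minimal PyTorch illustration only; the gated fusion form, mean pooling, cosine scoring, module names, and dimensions are all my assumptions, not the authors' released implementation.

```python
# Minimal sketch of the SIM-style pipeline described in the abstract.
# All design details below are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F


class EntityRelationFusion(nn.Module):
    """Adaptively encode each neighbor (relation, entity) pair into one vector."""
    def __init__(self, dim):
        super().__init__()
        self.gate = nn.Linear(2 * dim, dim)
        self.proj = nn.Linear(2 * dim, dim)

    def forward(self, rel_emb, ent_emb):
        # rel_emb, ent_emb: [batch, num_neighbors, dim]
        x = torch.cat([rel_emb, ent_emb], dim=-1)
        g = torch.sigmoid(self.gate(x))          # per-neighbor gate
        return g * torch.tanh(self.proj(x))      # fused neighbor encoding


class SIMSketch(nn.Module):
    """Encode neighbor sets with a Transformer, then score a query pair
    against the support (reference) pairs with an attention-weighted sum."""
    def __init__(self, dim=100, heads=4, layers=2):
        super().__init__()
        self.fusion = EntityRelationFusion(dim)
        enc_layer = nn.TransformerEncoderLayer(d_model=dim, nhead=heads,
                                               batch_first=True)
        self.encoder = nn.TransformerEncoder(enc_layer, num_layers=layers)
        self.pair_proj = nn.Linear(2 * dim, dim)

    def encode_entity(self, rel_emb, ent_emb):
        # Self-attention captures latent correlation among neighbors;
        # mean pooling collapses them into a single entity representation.
        neigh = self.fusion(rel_emb, ent_emb)     # [B, N, dim]
        neigh = self.encoder(neigh)               # neighbor interaction
        return neigh.mean(dim=1)                  # [B, dim]

    def _pair(self, p):
        h = self.encode_entity(p["head_rel"], p["head_ent"]).squeeze(0)
        t = self.encode_entity(p["tail_rel"], p["tail_ent"]).squeeze(0)
        return torch.tanh(self.pair_proj(torch.cat([h, t], dim=-1)))

    def forward(self, support, query):
        # support: K reference pairs; query: one candidate pair. Each pair is a
        # dict of neighbor relation/entity embeddings for its head and tail.
        sup = torch.stack([self._pair(p) for p in support])   # [K, dim]
        q = self._pair(query)                                  # [dim]
        # References closer to the query contribute more to the final score.
        att = F.softmax(sup @ q, dim=0)                        # [K]
        sims = F.cosine_similarity(sup, q.unsqueeze(0), dim=-1)
        return (att * sims).sum()


if __name__ == "__main__":
    dim, n_neigh, k_shot = 100, 8, 3
    rand = lambda: {k: torch.randn(1, n_neigh, dim)
                    for k in ("head_rel", "head_ent", "tail_rel", "tail_ent")}
    model = SIMSketch(dim)
    print(model([rand() for _ in range(k_shot)], rand()).item())
```

In an actual few-shot completion setting, the neighbor embeddings would come from a pretrained KG embedding table and the matcher would be trained episodically, e.g. with a margin-based loss over positive and negative query pairs; those training details are not specified in the abstract.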

Source journal
ACM Transactions on the Web
Category: Engineering & Technology - Computer Science: Software Engineering
CiteScore: 4.90
Self-citation rate: 0.00%
Articles per year: 26
Review time: 7.5 months
Journal description: Transactions on the Web (TWEB) is a journal publishing refereed articles reporting the results of research on Web content, applications, use, and related enabling technologies. Topics in the scope of TWEB include but are not limited to the following: Browsers and Web Interfaces; Electronic Commerce; Electronic Publishing; Hypertext and Hypermedia; Semantic Web; Web Engineering; Web Services; and Service-Oriented Computing XML. In addition, papers addressing the intersection of the following broader technologies with the Web are also in scope: Accessibility; Business Services Education; Knowledge Management and Representation; Mobility and pervasive computing; Performance and scalability; Recommender systems; Searching, Indexing, Classification, Retrieval and Querying, Data Mining and Analysis; Security and Privacy; and User Interfaces. Papers discussing specific Web technologies, applications, content generation and management and use are within scope. Also, papers describing novel applications of the web as well as papers on the underlying technologies are welcome.