KG-ZESHEL: Knowledge Graph-Enhanced Zero-Shot Entity Linking

Petar Ristoski, Zhizhong Lin, Qunzhi Zhou
{"title":"KG-ZESHEL: Knowledge Graph-Enhanced Zero-Shot Entity Linking","authors":"Petar Ristoski, Zhizhong Lin, Qunzhi Zhou","doi":"10.1145/3460210.3493549","DOIUrl":null,"url":null,"abstract":"Entity linking is a fundamental task for a successful use of knowledge graphs in many information systems. It maps textual mentions to their corresponding entities in a given knowledge graph. However, with the rapid evolution of knowledge graphs, a large number of entities is continuously added over time. Performing entity linking on new, or unseen, entities poses a great challenge, as standard entity linking approaches require large amounts of labeled data for all new entities, and the underlying model must be regularly updated. To address this challenge, several zero-shot entity linking approaches have been proposed, which don't require additional labeled data to perform entity linking over unseen entities and new domains. Most of these approaches use large language models, such as BERT, to encode the textual description of the mentions and entities in a common embedding space, which allows linking mentions to unseen entities. While such approaches have shown good performance, one big drawback is that they are not able to exploit the entity symbolic information from the knowledge graph, such as entity types, relations, popularity scores and graph embeddings. In this paper, we present KG-ZESHEL, a knowledge graph-enhanced zero-shot entity linking approach, which extends an existing BERT-based zero-shot entity linking approach with mention and entity auxiliary information. 
Experiments on two benchmark entity linking datasets, show that our proposed approach outperforms the related BERT-based state-of-the-art entity linking models.","PeriodicalId":377331,"journal":{"name":"Proceedings of the 11th on Knowledge Capture Conference","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 11th on Knowledge Capture Conference","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3460210.3493549","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

Entity linking is a fundamental task for the successful use of knowledge graphs in many information systems. It maps textual mentions to their corresponding entities in a given knowledge graph. However, with the rapid evolution of knowledge graphs, large numbers of entities are continuously added over time. Performing entity linking on new, or unseen, entities poses a great challenge, as standard entity linking approaches require large amounts of labeled data for all new entities, and the underlying model must be regularly updated. To address this challenge, several zero-shot entity linking approaches have been proposed, which do not require additional labeled data to perform entity linking over unseen entities and new domains. Most of these approaches use large language models, such as BERT, to encode the textual descriptions of mentions and entities in a common embedding space, which allows linking mentions to unseen entities. While such approaches have shown good performance, one major drawback is that they cannot exploit symbolic entity information from the knowledge graph, such as entity types, relations, popularity scores, and graph embeddings. In this paper, we present KG-ZESHEL, a knowledge graph-enhanced zero-shot entity linking approach, which extends an existing BERT-based zero-shot entity linking approach with mention and entity auxiliary information. Experiments on two benchmark entity linking datasets show that our proposed approach outperforms the related BERT-based state-of-the-art entity linking models.
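The core idea described above — ranking candidate entities by embedding similarity, then blending in auxiliary symbolic signals from the knowledge graph — can be illustrated with a minimal sketch. This is not the paper's actual model: the encoder outputs are stand-ins for BERT embeddings, and the fusion here is a simple weighted sum using a single popularity score, whereas KG-ZESHEL integrates richer auxiliary information (types, relations, graph embeddings). All names and weights below are hypothetical.

```python
import numpy as np

def rank_entities(mention_emb, entity_embs, popularity, aux_weight=0.1):
    """Rank candidate entities for one mention.

    mention_emb : (d,) embedding of the mention in context
                  (stand-in for a BERT mention encoding)
    entity_embs : (n, d) embeddings of candidate entity descriptions
    popularity  : (n,) normalized symbolic KG signal per entity
    aux_weight  : hypothetical blending weight for the auxiliary signal
    Returns candidate indices sorted from best to worst.
    """
    # cosine similarity between the mention and each entity description
    m = mention_emb / np.linalg.norm(mention_emb)
    e = entity_embs / np.linalg.norm(entity_embs, axis=1, keepdims=True)
    text_score = e @ m
    # blend in the symbolic KG signal as a simple additive bonus
    score = text_score + aux_weight * popularity
    return np.argsort(-score)

# Toy example: entity 0 is textually closest to the mention,
# but a popularity boost can promote the near-tie entity 2.
mention = np.array([1.0, 0.0])
entities = np.array([[1.0, 0.0],
                     [0.0, 1.0],
                     [0.9, 0.1]])
print(rank_entities(mention, entities, np.zeros(3)))            # text only
print(rank_entities(mention, entities, np.array([0., 0., 1.]))) # with KG signal
```

With no auxiliary signal the textually closest entity wins; adding the popularity bonus changes the top-ranked candidate, which is the kind of effect auxiliary KG information is meant to provide for ambiguous, unseen entities.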