Advancing entity alignment with dangling cases: a structure-aware approach through optimal transport learning and contrastive learning

Jin Xu, Yangning Li, Xiangjin Xie, Niu Hu, Yinghui Li, Hai-Tao Zheng, Yong Jiang
{"title":"利用悬案推进实体对齐:通过优化传输学习和对比学习的结构感知方法","authors":"Jin Xu, Yangning Li, Xiangjin Xie, Niu Hu, Yinghui Li, Hai-Tao Zheng, Yong Jiang","doi":"10.1007/s00521-024-10276-1","DOIUrl":null,"url":null,"abstract":"<p>Entity alignment (EA) aims to discover the equivalent entities in different knowledge graphs (KGs), which plays an important role in knowledge engineering. Recently, EA with dangling entities has been proposed as a more realistic setting, which assumes that not all entities have corresponding equivalent entities. In this paper, we focus on this setting. Some work has explored this problem by leveraging translation API, pre-trained word embeddings, and other off-the-shelf tools. However, these approaches over-rely on the side information (e.g., entity names) and fail to work when the side information is absent. On the contrary, they still insufficiently exploit the most fundamental graph structure information in KG. To improve the exploitation of the structural information, we propose a novel entity alignment framework called Structure-aware Wasserstein Graph Contrastive Learning (SWGCL), which is refined on three dimensions: (i) Model. We propose a novel Gated Graph Attention Network to capture local and global graph structure attention. (ii) Training. Two learning objectives: contrastive learning and optimal transport learning, are designed to obtain distinguishable entity representations. (iii) Inference. In the inference phase, a PageRank-based method HOSS (Higher-Order Structural Similarity) is proposed to calculate higher-order graph structural similarity. Extensive experiments on two dangling benchmarks demonstrate that our SWGCL outperforms the current state-of-the-art methods with pure structural information in both traditional (relaxed) and dangling (consolidated) settings.\n</p>","PeriodicalId":18925,"journal":{"name":"Neural Computing and Applications","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-08-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Advancing entity alignment with dangling cases: a structure-aware approach through optimal transport learning and contrastive learning\",\"authors\":\"Jin Xu, Yangning Li, Xiangjin Xie, Niu Hu, Yinghui Li, Hai-Tao Zheng, Yong Jiang\",\"doi\":\"10.1007/s00521-024-10276-1\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Entity alignment (EA) aims to discover the equivalent entities in different knowledge graphs (KGs), which plays an important role in knowledge engineering. Recently, EA with dangling entities has been proposed as a more realistic setting, which assumes that not all entities have corresponding equivalent entities. In this paper, we focus on this setting. Some work has explored this problem by leveraging translation API, pre-trained word embeddings, and other off-the-shelf tools. However, these approaches over-rely on the side information (e.g., entity names) and fail to work when the side information is absent. On the contrary, they still insufficiently exploit the most fundamental graph structure information in KG. To improve the exploitation of the structural information, we propose a novel entity alignment framework called Structure-aware Wasserstein Graph Contrastive Learning (SWGCL), which is refined on three dimensions: (i) Model. We propose a novel Gated Graph Attention Network to capture local and global graph structure attention. (ii) Training. 
Two learning objectives: contrastive learning and optimal transport learning, are designed to obtain distinguishable entity representations. (iii) Inference. In the inference phase, a PageRank-based method HOSS (Higher-Order Structural Similarity) is proposed to calculate higher-order graph structural similarity. Extensive experiments on two dangling benchmarks demonstrate that our SWGCL outperforms the current state-of-the-art methods with pure structural information in both traditional (relaxed) and dangling (consolidated) settings.\\n</p>\",\"PeriodicalId\":18925,\"journal\":{\"name\":\"Neural Computing and Applications\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neural Computing and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1007/s00521-024-10276-1\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Computing and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1007/s00521-024-10276-1","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Entity alignment (EA) aims to discover equivalent entities across different knowledge graphs (KGs) and plays an important role in knowledge engineering. Recently, EA with dangling entities has been proposed as a more realistic setting, which assumes that not all entities have a corresponding equivalent entity; this paper focuses on that setting. Some work has explored the problem by leveraging translation APIs, pre-trained word embeddings, and other off-the-shelf tools. However, these approaches over-rely on side information (e.g., entity names) and fail when such information is absent; moreover, they insufficiently exploit the most fundamental graph structure information in KGs. To better exploit structural information, we propose a novel entity alignment framework called Structure-aware Wasserstein Graph Contrastive Learning (SWGCL), which is refined along three dimensions: (i) Model: a novel Gated Graph Attention Network captures local and global graph structure attention. (ii) Training: two learning objectives, contrastive learning and optimal transport learning, are designed to obtain distinguishable entity representations. (iii) Inference: a PageRank-based method, HOSS (Higher-Order Structural Similarity), is proposed to compute higher-order graph structural similarity. Extensive experiments on two dangling benchmarks demonstrate that SWGCL outperforms current state-of-the-art methods using pure structural information in both the traditional (relaxed) and dangling (consolidated) settings.

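The abstract only names SWGCL's training and inference components. As a rough, self-contained illustration of the optimal-transport idea behind alignment with dangling entities (not the authors' implementation), the sketch below computes an entropy-regularized Sinkhorn transport plan between two sets of entity embeddings and marks a source entity as dangling when even its best-matched target remains dissimilar. The function names, the regularizer eps, the iteration count, and the sim_threshold value are placeholder assumptions.

import numpy as np

def sinkhorn_alignment(src_emb, tgt_emb, eps=0.05, n_iters=200):
    # Entropy-regularized optimal transport (Sinkhorn) between two embedding sets.
    # Cost is 1 - cosine similarity; marginals are uniform over each KG's entities.
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    cost = 1.0 - src @ tgt.T
    K = np.exp(-cost / eps)
    a = np.full(src.shape[0], 1.0 / src.shape[0])   # source marginal
    b = np.full(tgt.shape[0], 1.0 / tgt.shape[0])   # target marginal
    u = np.ones_like(a)
    for _ in range(n_iters):
        v = b / (K.T @ u)
        u = a / (K @ v)
    return (u[:, None] * K) * v[None, :]            # transport plan P

def align_with_dangling(src_emb, tgt_emb, sim_threshold=0.7):
    # Match every source entity to its highest-mass target in the plan,
    # but report it as dangling (None) when even that target is dissimilar.
    P = sinkhorn_alignment(src_emb, tgt_emb)
    src = src_emb / np.linalg.norm(src_emb, axis=1, keepdims=True)
    tgt = tgt_emb / np.linalg.norm(tgt_emb, axis=1, keepdims=True)
    sim = src @ tgt.T
    best = P.argmax(axis=1)
    return [(i, int(j)) if sim[i, j] >= sim_threshold else (i, None)
            for i, j in enumerate(best)]

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    src = rng.normal(size=(6, 32))                                   # toy "KG1" embeddings
    tgt = np.vstack([src[:4] + 0.01 * rng.normal(size=(4, 32)),      # 4 true counterparts
                     rng.normal(size=(3, 32))])                      # 3 unrelated entities
    # Sources 0-3 should typically match targets 0-3; sources 4 and 5 come out as dangling.
    print(align_with_dangling(src, tgt))

Balanced Sinkhorn with uniform marginals is used here purely for simplicity; a dangling-aware system would more likely relax the marginal constraints or add an explicit dangling-detection step.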
