Connected multi-hierarchies lightweight global hierarchical model in hyper-relational knowledge graphs

IF 6.5 | CAS Zone 2, Computer Science | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Jiahang Li, Qilong Han, Hui Zhang, Lijie Li, Dan Lu
{"title":"超关系知识图中的连接多层次轻量级全局层次模型","authors":"Jiahang Li,&nbsp;Qilong Han,&nbsp;Hui Zhang,&nbsp;Lijie Li,&nbsp;Dan Lu","doi":"10.1016/j.neucom.2025.131641","DOIUrl":null,"url":null,"abstract":"<div><div>Hyper-relational knowledge graphs enriched by qualifiers have wide applications across diverse fields, and knowledge representation learning is emerging as a prominent research focus. Existing representation methods primarily concentrate on the local hierarchies of each element, overlooking the global hierarchies and their complex dependencies which can result in substantial semantic incompleteness and degraded model generalization. While modeling global hierarchical semantics presents a challenge, integrating local and global hierarchies further increases computational complexity. To tackle these challenges, we propose CMLG, a lightweight global hierarchical representation learning method that connects multiple hierarchies and leverages varied hierarchical details to improve learning effectiveness. Specifically, interactions within local hierarchies are utilized to update the local vectors of triples and qualifiers, thereby capturing essential semantic aggregations at the local hierarchies to construct global hierarchical expressions of entities and relations. These global representations encompass the essential features of hyper-relational facts and are utilized for computational tasks across various domains. To enhance the quality of embeddings, contrastive methods that connect multi-hierarchies are utilized within and across these hierarchies to boost the model’s learning capabilities. Considering the computational resources required for learning at both local and global hierarchies, CMLG adopts the lightweight design to reduce the parameters and computational demands of training, thereby enhancing its suitability for large-scale datasets. Comprehensive experiments on various datasets reveal that our approach outperforms advanced models, achieving up to a 12 % improvement in MRR over the runner-ups.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"657 ","pages":"Article 131641"},"PeriodicalIF":6.5000,"publicationDate":"2025-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Connected multi-hierarchies lightweight global hierarchical model in hyper-relational knowledge graphs\",\"authors\":\"Jiahang Li,&nbsp;Qilong Han,&nbsp;Hui Zhang,&nbsp;Lijie Li,&nbsp;Dan Lu\",\"doi\":\"10.1016/j.neucom.2025.131641\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Hyper-relational knowledge graphs enriched by qualifiers have wide applications across diverse fields, and knowledge representation learning is emerging as a prominent research focus. Existing representation methods primarily concentrate on the local hierarchies of each element, overlooking the global hierarchies and their complex dependencies which can result in substantial semantic incompleteness and degraded model generalization. While modeling global hierarchical semantics presents a challenge, integrating local and global hierarchies further increases computational complexity. To tackle these challenges, we propose CMLG, a lightweight global hierarchical representation learning method that connects multiple hierarchies and leverages varied hierarchical details to improve learning effectiveness. 
Specifically, interactions within local hierarchies are utilized to update the local vectors of triples and qualifiers, thereby capturing essential semantic aggregations at the local hierarchies to construct global hierarchical expressions of entities and relations. These global representations encompass the essential features of hyper-relational facts and are utilized for computational tasks across various domains. To enhance the quality of embeddings, contrastive methods that connect multi-hierarchies are utilized within and across these hierarchies to boost the model’s learning capabilities. Considering the computational resources required for learning at both local and global hierarchies, CMLG adopts the lightweight design to reduce the parameters and computational demands of training, thereby enhancing its suitability for large-scale datasets. Comprehensive experiments on various datasets reveal that our approach outperforms advanced models, achieving up to a 12 % improvement in MRR over the runner-ups.</div></div>\",\"PeriodicalId\":19268,\"journal\":{\"name\":\"Neurocomputing\",\"volume\":\"657 \",\"pages\":\"Article 131641\"},\"PeriodicalIF\":6.5000,\"publicationDate\":\"2025-09-23\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Neurocomputing\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0925231225023136\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225023136","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Hyper-relational knowledge graphs enriched by qualifiers have wide applications across diverse fields, and knowledge representation learning is emerging as a prominent research focus. Existing representation methods primarily concentrate on the local hierarchies of each element, overlooking the global hierarchies and their complex dependencies, which can result in substantial semantic incompleteness and degraded model generalization. While modeling global hierarchical semantics presents a challenge, integrating local and global hierarchies further increases computational complexity. To tackle these challenges, we propose CMLG, a lightweight global hierarchical representation learning method that connects multiple hierarchies and leverages varied hierarchical details to improve learning effectiveness. Specifically, interactions within local hierarchies are utilized to update the local vectors of triples and qualifiers, thereby capturing essential semantic aggregations at the local hierarchies to construct global hierarchical expressions of entities and relations. These global representations encompass the essential features of hyper-relational facts and are utilized for computational tasks across various domains. To enhance the quality of embeddings, contrastive methods that connect multi-hierarchies are utilized within and across these hierarchies to boost the model's learning capabilities. Considering the computational resources required for learning at both local and global hierarchies, CMLG adopts a lightweight design to reduce the parameters and computational demands of training, thereby enhancing its suitability for large-scale datasets. Comprehensive experiments on various datasets reveal that our approach outperforms advanced models, achieving up to a 12% improvement in MRR over the runner-ups.
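As a rough illustration of the pipeline the abstract outlines, the sketch below shows one plausible way to aggregate a hyper-relational fact (a main triple plus qualifier pairs) into local vectors, compose them into a global fact representation, and connect the two hierarchies with a contrastive objective. Everything here (module names, the mean-pooling aggregator, the InfoNCE-style loss, and the dimensions) is an illustrative assumption under common KG-embedding conventions, not the authors' actual CMLG architecture.

```python
# Minimal, hypothetical sketch: local aggregation of a hyper-relational fact
# and a contrastive term connecting local and global hierarchies.
import torch
import torch.nn as nn
import torch.nn.functional as F

class HyperRelationalEncoder(nn.Module):
    def __init__(self, n_entities, n_relations, dim=64):
        super().__init__()
        self.ent = nn.Embedding(n_entities, dim)
        self.rel = nn.Embedding(n_relations, dim)
        # lightweight local aggregator: a single shared linear projection
        self.local_proj = nn.Linear(3 * dim, dim)

    def local_vectors(self, h, r, t, qual_r, qual_e):
        """Local hierarchy: the main triple and the attached
        (qualifier relation, qualifier entity) pairs are aggregated separately."""
        triple_vec = self.local_proj(
            torch.cat([self.ent(h), self.rel(r), self.ent(t)], dim=-1))
        # qualifiers: mean over the pairs attached to each fact
        qual_vec = (self.rel(qual_r) + self.ent(qual_e)).mean(dim=-2)
        return triple_vec, qual_vec

    def global_vector(self, triple_vec, qual_vec):
        """Global hierarchy: compose local vectors into one fact representation."""
        return F.normalize(triple_vec + qual_vec, dim=-1)

def contrastive_loss(local_vec, global_vec, temperature=0.1):
    """InfoNCE-style loss connecting hierarchies: each fact's local vector
    should be closest to its own global vector within the batch."""
    local_vec = F.normalize(local_vec, dim=-1)
    logits = local_vec @ global_vec.t() / temperature
    targets = torch.arange(local_vec.size(0), device=local_vec.device)
    return F.cross_entropy(logits, targets)
```

In this reading, mean-pooling the qualifier pairs keeps the parameter count independent of how many qualifiers a fact carries, which is one simple way to interpret the abstract's "lightweight" claim; the paper itself may use a different aggregator and scoring function.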
Source journal
Neurocomputing
Category: Engineering & Technology - Computer Science: Artificial Intelligence
CiteScore: 13.10
Self-citation rate: 10.00%
Annual articles: 1382
Review time: 70 days
Journal description: Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing. Neurocomputing theory, practice and applications are the essential topics being covered.