Jiahang Li, Qilong Han, Hui Zhang, Lijie Li, Dan Lu
{"title":"Connected multi-hierarchies lightweight global hierarchical model in hyper-relational knowledge graphs","authors":"Jiahang Li, Qilong Han, Hui Zhang, Lijie Li, Dan Lu","doi":"10.1016/j.neucom.2025.131641","DOIUrl":null,"url":null,"abstract":"<div><div>Hyper-relational knowledge graphs enriched by qualifiers have wide applications across diverse fields, and knowledge representation learning is emerging as a prominent research focus. Existing representation methods primarily concentrate on the local hierarchies of each element, overlooking the global hierarchies and their complex dependencies which can result in substantial semantic incompleteness and degraded model generalization. While modeling global hierarchical semantics presents a challenge, integrating local and global hierarchies further increases computational complexity. To tackle these challenges, we propose CMLG, a lightweight global hierarchical representation learning method that connects multiple hierarchies and leverages varied hierarchical details to improve learning effectiveness. Specifically, interactions within local hierarchies are utilized to update the local vectors of triples and qualifiers, thereby capturing essential semantic aggregations at the local hierarchies to construct global hierarchical expressions of entities and relations. These global representations encompass the essential features of hyper-relational facts and are utilized for computational tasks across various domains. To enhance the quality of embeddings, contrastive methods that connect multi-hierarchies are utilized within and across these hierarchies to boost the model’s learning capabilities. Considering the computational resources required for learning at both local and global hierarchies, CMLG adopts the lightweight design to reduce the parameters and computational demands of training, thereby enhancing its suitability for large-scale datasets. 
Comprehensive experiments on various datasets reveal that our approach outperforms advanced models, achieving up to a 12 % improvement in MRR over the runner-ups.</div></div>","PeriodicalId":19268,"journal":{"name":"Neurocomputing","volume":"657 ","pages":"Article 131641"},"PeriodicalIF":6.5000,"publicationDate":"2025-09-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neurocomputing","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0925231225023136","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Hyper-relational knowledge graphs enriched by qualifiers have wide applications across diverse fields, and knowledge representation learning is emerging as a prominent research focus. Existing representation methods concentrate primarily on the local hierarchies of each element, overlooking the global hierarchies and their complex dependencies, which can result in substantial semantic incompleteness and degraded model generalization. While modeling global hierarchical semantics presents a challenge in itself, integrating local and global hierarchies further increases computational complexity. To tackle these challenges, we propose CMLG, a lightweight global hierarchical representation learning method that connects multiple hierarchies and leverages varied hierarchical details to improve learning effectiveness. Specifically, interactions within local hierarchies are used to update the local vectors of triples and qualifiers, capturing essential semantic aggregations at the local hierarchies in order to construct global hierarchical expressions of entities and relations. These global representations encompass the essential features of hyper-relational facts and can be used for computational tasks across various domains. To enhance the quality of the embeddings, contrastive methods that connect multiple hierarchies are applied within and across these hierarchies to boost the model's learning capability. Considering the computational resources required for learning at both local and global hierarchies, CMLG adopts a lightweight design that reduces the parameter count and computational demands of training, making it better suited to large-scale datasets. Comprehensive experiments on various datasets show that our approach outperforms advanced models, achieving up to a 12% improvement in MRR over the runners-up.
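To make the abstract's two core notions concrete, the sketch below illustrates (a) a hyper-relational fact, i.e. a main triple extended with qualifier key-value pairs, and (b) the MRR (mean reciprocal rank) metric used in the reported evaluation. This is a minimal illustrative example, not CMLG's implementation; the `HyperRelationalFact` class and the sample fact are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class HyperRelationalFact:
    """A main triple (head, relation, tail) plus qualifier key-value pairs.

    Qualifiers refine the main triple, e.g. a degree or an end date
    attached to an "educated_at" statement. (Hypothetical structure,
    not the paper's data format.)
    """
    head: str
    relation: str
    tail: str
    qualifiers: dict = field(default_factory=dict)

def mean_reciprocal_rank(ranks):
    """MRR: average of 1/rank over all test queries (ranks are 1-based)."""
    return sum(1.0 / r for r in ranks) / len(ranks)

# A hyper-relational fact with two qualifiers (illustrative content).
fact = HyperRelationalFact(
    head="Marie Curie",
    relation="educated_at",
    tail="University of Paris",
    qualifiers={"academic_degree": "Master of Science", "end_time": "1894"},
)

# If a model ranks the correct entity 1st, 2nd, and 4th on three queries,
# MRR = (1/1 + 1/2 + 1/4) / 3 ≈ 0.583.
print(mean_reciprocal_rank([1, 2, 4]))
```

A 12% relative MRR gain, as claimed in the abstract, would for instance lift an MRR of 0.50 to 0.56.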
About the journal:
Neurocomputing publishes articles describing recent fundamental contributions in the field of neurocomputing, covering its theory, practice, and applications.