Geng-jing Chen, Gong-de Guo, S. Lorraine Martin, Hui Wang
{"title":"链接预测的低维交叉注意模型及其在药物再利用中的应用","authors":"Geng-jing Chen , Gong-de Guo , S. Lorraine Martin , Hui Wang","doi":"10.1016/j.knosys.2025.113562","DOIUrl":null,"url":null,"abstract":"<div><div>Link prediction, a key technique for knowledge graph completion, has advanced with transformer-based encoders utilizing high-dimensional embeddings and self-attention mechanisms. However, these approaches often result in models with excessive parameters, poor scalability, and substantial computational demands, limiting their practical applicability. To address these limitations, this paper introduces a low-dimensional link prediction model that leverages cross-attention for improved efficiency and scalability. Our approach employs low-dimensional embeddings to capture essential, non-redundant information about entities and relations, significantly reducing computational and memory requirements. Unlike self-attention, which models interactions within a single set of embeddings, cross-attention in our model captures complex interactions between entities and relations in a compact, low-dimensional space. Additionally, a streamlined decoding method simplifies computations, reducing processing time without compromising accuracy. Experimental results show that our model outperforms most state-of-the-art link prediction models on two public datasets, WN18RR and FB15k-237. Compared to these top-performing methods, our model contains only 18.1 % and 25.4 % of the parameters of these comparable models, while incurring a performance loss of merely 2.4 % and 3.1 %, respectively. Furthermore, it achieves an average 72 % reduction in embedding dimensions compared to five leading models. A case study on drug repurposing further illustrates the model's potential for real-world applications in knowledge graph completion.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"319 ","pages":"Article 113562"},"PeriodicalIF":7.2000,"publicationDate":"2025-04-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A low-dimensional cross-attention model for link prediction with applications to drug repurposing\",\"authors\":\"Geng-jing Chen , Gong-de Guo , S. Lorraine Martin , Hui Wang\",\"doi\":\"10.1016/j.knosys.2025.113562\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Link prediction, a key technique for knowledge graph completion, has advanced with transformer-based encoders utilizing high-dimensional embeddings and self-attention mechanisms. However, these approaches often result in models with excessive parameters, poor scalability, and substantial computational demands, limiting their practical applicability. To address these limitations, this paper introduces a low-dimensional link prediction model that leverages cross-attention for improved efficiency and scalability. Our approach employs low-dimensional embeddings to capture essential, non-redundant information about entities and relations, significantly reducing computational and memory requirements. Unlike self-attention, which models interactions within a single set of embeddings, cross-attention in our model captures complex interactions between entities and relations in a compact, low-dimensional space. Additionally, a streamlined decoding method simplifies computations, reducing processing time without compromising accuracy. 
Experimental results show that our model outperforms most state-of-the-art link prediction models on two public datasets, WN18RR and FB15k-237. Compared to these top-performing methods, our model contains only 18.1 % and 25.4 % of the parameters of these comparable models, while incurring a performance loss of merely 2.4 % and 3.1 %, respectively. Furthermore, it achieves an average 72 % reduction in embedding dimensions compared to five leading models. A case study on drug repurposing further illustrates the model's potential for real-world applications in knowledge graph completion.</div></div>\",\"PeriodicalId\":49939,\"journal\":{\"name\":\"Knowledge-Based Systems\",\"volume\":\"319 \",\"pages\":\"Article 113562\"},\"PeriodicalIF\":7.2000,\"publicationDate\":\"2025-04-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Knowledge-Based Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0950705125006082\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125006082","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
A low-dimensional cross-attention model for link prediction with applications to drug repurposing
Link prediction, a key technique for knowledge graph completion, has advanced with transformer-based encoders that use high-dimensional embeddings and self-attention mechanisms. However, these approaches often yield models with excessive parameters, poor scalability, and substantial computational demands, limiting their practical applicability. To address these limitations, this paper introduces a low-dimensional link prediction model that leverages cross-attention for improved efficiency and scalability. Our approach employs low-dimensional embeddings to capture essential, non-redundant information about entities and relations, significantly reducing computational and memory requirements. Unlike self-attention, which models interactions within a single set of embeddings, cross-attention in our model captures complex interactions between entities and relations in a compact, low-dimensional space. Additionally, a streamlined decoding method simplifies computation, reducing processing time without compromising accuracy. Experimental results show that our model outperforms most state-of-the-art link prediction models on two public datasets, WN18RR and FB15k-237. Compared with the top-performing methods on these datasets, our model contains only 18.1% and 25.4% of their parameters, respectively, while incurring performance losses of merely 2.4% and 3.1%. Furthermore, it achieves an average 72% reduction in embedding dimensions compared with five leading models. A case study on drug repurposing further illustrates the model's potential for real-world applications in knowledge graph completion.
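The abstract does not specify the architecture, so the following is a minimal sketch, not the authors' implementation, of the general idea it describes: low-dimensional entity and relation embeddings fused by cross-attention (the head-entity embedding queries the relation embedding, rather than self-attention over one joint sequence) and decoded with a simple dot product against all entity embeddings. All names, dimensions, and hyperparameters here (e.g., dim=32, four heads) are illustrative assumptions; the entity and relation counts in the usage example match WN18RR (40,943 entities, 11 relations).

```python
# Sketch of cross-attention link prediction (illustrative, not the paper's code).
import torch
import torch.nn as nn


class CrossAttentionLinkPredictor(nn.Module):
    def __init__(self, num_entities: int, num_relations: int,
                 dim: int = 32, num_heads: int = 4):
        super().__init__()
        # Low-dimensional embeddings keep the parameter count small.
        self.entity_emb = nn.Embedding(num_entities, dim)
        self.relation_emb = nn.Embedding(num_relations, dim)
        # Cross-attention: queries come from one embedding set (entities),
        # keys/values from the other (relations).
        self.cross_attn = nn.MultiheadAttention(dim, num_heads, batch_first=True)
        self.norm = nn.LayerNorm(dim)

    def forward(self, head_idx: torch.Tensor, rel_idx: torch.Tensor) -> torch.Tensor:
        h = self.entity_emb(head_idx).unsqueeze(1)   # (B, 1, dim) query
        r = self.relation_emb(rel_idx).unsqueeze(1)  # (B, 1, dim) key/value
        fused, _ = self.cross_attn(query=h, key=r, value=r)
        fused = self.norm(fused.squeeze(1) + h.squeeze(1))  # residual + norm
        # Streamlined decoder: score every entity as a candidate tail via a
        # dot product with the fused (head, relation) representation.
        return fused @ self.entity_emb.weight.t()    # (B, num_entities)


# Usage: score candidate tails for two (head, relation) queries on WN18RR-sized tables.
model = CrossAttentionLinkPredictor(num_entities=40943, num_relations=11)
scores = model(torch.tensor([0, 1]), torch.tensor([3, 7]))
print(scores.shape)  # torch.Size([2, 40943])
```

Under these assumptions, the efficiency claims are plausible: a single query attending to a single relation vector costs only O(dim^2) per triple, and reusing the entity embedding table as a dot-product decoder avoids any separate output projection.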
Journal Introduction:
Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on systems built with knowledge-based and other artificial intelligence techniques. The journal aims to support human prediction and decision-making through data science and computation techniques, provide balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.