{"title":"HKANLP: Link Prediction With Hyperspherical Embeddings and Kolmogorov-Arnold Networks.","authors":"Wenchuan Zhang,Wentao Fan,Weifeng Su,Nizar Bouguila","doi":"10.1109/tnnls.2025.3614341","DOIUrl":null,"url":null,"abstract":"Link prediction (LP) is fundamental to graph-based applications, yet existing graph autoencoders (GAEs) and variational GAEs (VGAEs) often struggle with intrinsic graph properties, particularly the presence of negative eigenvalues in adjacency matrices, which limits their adaptability and predictive performance. To address this limitation, we propose Hyperspherical Kolmogorov-Arnold Networks for LP (HKANLP), a novel framework that combines multiple graph neural network (GNN)-based representation learning strategies with Kolmogorov-Arnold networks (KANs) in a hyperspherical embedding space. Specifically, our model leverages the von Mises-Fisher (vMF) distribution to impose geometric consistency in the latent space and employs KANs as universal function approximators to reconstruct adjacency matrices, thereby mitigating the impact of negative eigenvalues and enhancing spectral diversity. Extensive experiments on homophilous, heterophilous, and large-scale graph datasets demonstrate that HKANLP achieves superior LP performance and robustness compared to state-of-the-art baselines. Furthermore, visualization analyses illustrate the model's effectiveness in capturing complex structural patterns. 
The source code of our model is publicly available at https://github.com/zxj8806/HKANLP/.","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"53 1","pages":""},"PeriodicalIF":8.9000,"publicationDate":"2025-10-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/tnnls.2025.3614341","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0
Abstract
Link prediction (LP) is fundamental to graph-based applications, yet existing graph autoencoders (GAEs) and variational GAEs (VGAEs) often struggle with intrinsic graph properties, particularly the presence of negative eigenvalues in adjacency matrices, which limits their adaptability and predictive performance. To address this limitation, we propose Hyperspherical Kolmogorov-Arnold Networks for LP (HKANLP), a novel framework that combines multiple graph neural network (GNN)-based representation learning strategies with Kolmogorov-Arnold networks (KANs) in a hyperspherical embedding space. Specifically, our model leverages the von Mises-Fisher (vMF) distribution to impose geometric consistency in the latent space and employs KANs as universal function approximators to reconstruct adjacency matrices, thereby mitigating the impact of negative eigenvalues and enhancing spectral diversity. Extensive experiments on homophilous, heterophilous, and large-scale graph datasets demonstrate that HKANLP achieves superior LP performance and robustness compared to state-of-the-art baselines. Furthermore, visualization analyses illustrate the model's effectiveness in capturing complex structural patterns. The source code of our model is publicly available at https://github.com/zxj8806/HKANLP/.
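The abstract's two core ingredients can be illustrated with a minimal sketch: projecting node embeddings onto the unit hypersphere (the support of a von Mises-Fisher distribution) and scoring candidate edges with a Kolmogorov-Arnold-style decoder, i.e., a sum of learned univariate functions. The function names, the piecewise-linear parameterization of the univariate functions, and the elementwise-product interaction are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def to_hypersphere(Z):
    """Project node embeddings onto the unit hypersphere, the support
    of a von Mises-Fisher (vMF) distribution as used in HKANLP's
    latent space. Z has shape (num_nodes, dim)."""
    norms = np.linalg.norm(Z, axis=1, keepdims=True)
    return Z / np.clip(norms, 1e-12, None)

def kan_edge_score(zi, zj, grid, coeffs):
    """Toy Kolmogorov-Arnold-style edge decoder (hypothetical stand-in
    for the paper's KAN): apply a separate learned 1-D function to each
    coordinate of the elementwise interaction z_i * z_j, sum the
    results, and squash to an edge probability in (0, 1).

    Each 1-D function phi_d is represented as a piecewise-linear
    interpolant with values coeffs[d] on the shared knot grid."""
    x = zi * zj  # per-coordinate interaction; entries lie in [-1, 1]
    s = 0.0
    for d in range(x.shape[0]):
        s += np.interp(x[d], grid, coeffs[d])
    return 1.0 / (1.0 + np.exp(-s))  # logistic link

# Usage sketch with random embeddings:
rng = np.random.default_rng(0)
Z = to_hypersphere(rng.normal(size=(4, 8)))   # 4 nodes on the 8-D sphere
grid = np.linspace(-1.0, 1.0, 5)              # shared knots for all phi_d
coeffs = rng.normal(size=(8, 5))              # one value set per coordinate
p01 = kan_edge_score(Z[0], Z[1], grid, coeffs)
```

Because the interaction `z_i * z_j` is symmetric in its arguments, the decoder scores undirected edges consistently; in the actual model the `coeffs` would be trained by reconstructing the adjacency matrix rather than drawn at random.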
Journal Introduction
The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.