Common neighbor-aware link weight prediction with simplified graph transformer
Author: Lizhi Liu
Journal: Applied Soft Computing, Volume 182, Article 113614 (Journal Article; JCR Q1, Computer Science, Artificial Intelligence; Impact Factor 7.2)
DOI: 10.1016/j.asoc.2025.113614
Published: 2025-07-22
URL: https://www.sciencedirect.com/science/article/pii/S1568494625009251
Citations: 0
Abstract
Link weight prediction is important in many fields, yet it remains underexplored. Building a strong model faces two major challenges. First, classic graph neural networks can propagate information only along adjacency connections, due to the message-passing paradigm; when some edges are unobserved, learning good node representations is hindered. Second, existing methods often condense local topological patterns into link representations either by graph pooling on enclosing subgraphs or via handcrafted feature indices: the former incurs a heavy computational burden, while the latter lacks flexibility. To address these challenges, we present a novel link weight prediction algorithm named CoNe. We design a simplified graph Transformer with linear complexity that simultaneously captures local and global topological structure. Specifically, CoNe leverages a novel simplified global attention mechanism, so that interactions are no longer hardwired to static edges but can be flexibly and efficiently extended to arbitrary nodes. Furthermore, we propose self-attentive common neighbor aggregation to embed link heuristics into learnable pairwise representations. Experiments on real-world datasets demonstrate that CoNe outperforms state-of-the-art methods by 0.51%–14.67%.
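The abstract names two ingredients: a linear-complexity global attention (every node can attend to every other node without materializing an n×n attention matrix) and an attention-based pooling of common-neighbor embeddings for a candidate link. The paper's exact formulation is not given here, so the following is only a generic sketch of those two ideas, assuming a kernelized (positive feature map) linear attention and a softmax pooling over common neighbors; all function names and the feature map are illustrative, not the authors' implementation.

```python
import numpy as np

def linear_global_attention(X, Wq, Wk, Wv):
    """Kernelized global attention in O(n * d^2) instead of O(n^2 * d).

    Replaces softmax(QK^T)V with phi(Q) (phi(K)^T V), where phi is a
    positive feature map, so the (d, d) summary phi(K)^T V is shared
    by all query nodes.  X: (n, d) node features; W*: (d, d) weights.
    """
    phi = lambda M: np.maximum(M, 0.0) + 1e-6  # simple positive feature map
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    Qf, Kf = phi(Q), phi(K)
    KV = Kf.T @ V                                   # (d, d) global summary
    Z = Qf @ KV                                     # numerator for all nodes
    denom = Qf @ Kf.sum(axis=0, keepdims=True).T    # (n, 1) normalization
    return Z / denom

def common_neighbor_score(H, adj, u, v):
    """Attention-pool the embeddings of the common neighbors of (u, v).

    H: (n, d) node embeddings; adj: boolean (n, n) adjacency.
    Returns a d-dim pairwise representation (zeros if no common neighbor).
    """
    cn = np.where(adj[u] & adj[v])[0]
    if cn.size == 0:
        return np.zeros(H.shape[1])
    query = H[u] * H[v]                     # pair-conditioned query
    scores = H[cn] @ query                  # one score per common neighbor
    w = np.exp(scores - scores.max())
    w /= w.sum()                            # softmax attention weights
    return w @ H[cn]                        # weighted sum of CN embeddings
```

In a full model, the pooled common-neighbor vector would be concatenated with the endpoint embeddings and fed to a regressor that outputs the predicted edge weight; the key property of the linear attention above is that its cost grows linearly in the number of nodes.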
Journal overview:
Applied Soft Computing is an international journal promoting an integrated view of soft computing to solve real-life problems. Its focus is on publishing the highest-quality research on the application and convergence of Fuzzy Logic, Neural Networks, Evolutionary Computing, Rough Sets, and similar techniques to address real-world complexities.
Applied Soft Computing is a rolling publication: articles are published as soon as the editor-in-chief has accepted them. The website is therefore continuously updated with new articles, and publication times are short.