{"title":"Cross-view self-supervised heterogeneous graph representation learning","authors":"Danfeng Zhao, Yanhao Chen, Wei Song, Qi He","doi":"10.1016/j.neunet.2025.107681","DOIUrl":null,"url":null,"abstract":"<div><div>Heterogeneous graph neural networks (HGNNs) often face challenges in efficiently integrating information from multiple views, which hinders their ability to fully leverage complex data structures. To overcome this problem, we present an improved graph-level cross-attention mechanism specifically designed to enhance multi-view integration and improve the model's expressiveness in heterogeneous networks. By incorporating random walks, the Katz index, and Transformers, the model captures higher-order semantic relationships between nodes within the meta-path view. Node context information is extracted by decomposing the network and applying the attention mechanism within the network schema view. The improved graph-level cross-attention in the cross-view context adaptively fuses features from both views. Furthermore, a contrastive loss function is employed to select positive samples based on the local connection strength and global centrality of nodes, enhancing the model's robustness. The suggested self-supervised model performs exceptionally well in node classification and clustering tasks, according to experimental data, demonstrating the effectiveness of our method.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"190 ","pages":"Article 107681"},"PeriodicalIF":6.0000,"publicationDate":"2025-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025005611","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Heterogeneous graph neural networks (HGNNs) often struggle to efficiently integrate information from multiple views, which hinders their ability to fully leverage complex data structures. To overcome this problem, we present an improved graph-level cross-attention mechanism specifically designed to enhance multi-view integration and improve the model's expressiveness in heterogeneous networks. By incorporating random walks, the Katz index, and Transformers, the model captures higher-order semantic relationships between nodes within the meta-path view. Node context information is extracted by decomposing the network and applying the attention mechanism within the network schema view. The improved graph-level cross-attention in the cross-view context adaptively fuses features from both views. Furthermore, a contrastive loss function is employed to select positive samples based on the local connection strength and global centrality of nodes, enhancing the model's robustness. Experimental results show that the proposed self-supervised model performs strongly on node classification and clustering tasks, demonstrating the effectiveness of our method.
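The abstract uses the Katz index to score higher-order proximity between nodes in the meta-path view. For reference, the sketch below computes the standard closed-form Katz index, K = (I − βA)⁻¹ − I, which sums β^k·A^k over all path lengths k ≥ 1 so that K[i, j] reflects connectivity through paths of every length. The toy graph, the damping factor `beta`, and the function name `katz_index` are illustrative assumptions, not details from the paper.

```python
# Minimal NumPy sketch of the closed-form Katz index (illustrative, not
# the authors' implementation).
import numpy as np

def katz_index(adj: np.ndarray, beta: float) -> np.ndarray:
    """Katz index K = (I - beta*A)^(-1) - I.

    The geometric series converges only when beta < 1 / lambda_max(A),
    where lambda_max is the largest eigenvalue magnitude of A.
    """
    n = adj.shape[0]
    lam_max = np.max(np.abs(np.linalg.eigvals(adj)))
    assert beta < 1.0 / lam_max, "beta too large: Katz series diverges"
    return np.linalg.inv(np.eye(n) - beta * adj) - np.eye(n)

# Toy 4-node undirected graph: path 0-1-2-3 plus a chord 0-2.
A = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)

K = katz_index(A, beta=0.2)
print(np.round(K, 3))
# K[0, 3] > 0 even though nodes 0 and 3 are not adjacent: the index
# captures the kind of higher-order relationship the abstract refers to.
```

Such a proximity matrix can, for example, weight neighbor sampling or attention scores; how the paper combines it with random walks and Transformers is described in the full text.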
About the Journal
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.