{"title":"Generative and contrastive graph representation learning with message passing","authors":"Ying Tang, Yining Yang, Guodao Sun","doi":"10.1016/j.neunet.2025.107224","DOIUrl":null,"url":null,"abstract":"<div><div>Self-supervised graph representation learning (SSGRL) has emerged as a promising approach for graph embeddings because it does not rely on manual labels. SSGRL methods are generally divided into generative and contrastive approaches. Generative methods often suffer from poor graph quality, while contrastive methods, which compare augmented views, are more resistant to noise. However, the performance of contrastive methods depends heavily on well-designed data augmentation and high-quality negative samples. Pure generative or contrastive methods alone cannot balance both robustness and performance. To address these issues, we propose a self-supervised graph representation learning method that integrates generative and contrastive ideas, namely Contrastive Generative Message Passing Graph Learning (CGMP-GL). CGMP-GL incorporates the concept of contrast into the generative model and message aggregation module, enhancing the discriminability of node representations by aligning positive samples and separating negative samples. On one hand, CGMP-GL integrates multi-granularity topology and feature information through cross-view multi-level contrast while reconstructing masked node features. On the other hand, CGMP-GL optimizes node representations through self-supervised contrastive message passing, thereby enhancing model performance in various downstream tasks. Extensive experiments over multiple datasets and downstream tasks demonstrate the effectiveness and robustness of CGMP-GL.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"185 ","pages":"Article 107224"},"PeriodicalIF":6.0000,"publicationDate":"2025-02-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025001030","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Self-supervised graph representation learning (SSGRL) has emerged as a promising approach to graph embedding because it does not rely on manual labels. SSGRL methods are generally divided into generative and contrastive approaches. Generative methods often suffer when graph quality is poor, while contrastive methods, which compare augmented views, are more resistant to noise. However, the performance of contrastive methods depends heavily on well-designed data augmentation and high-quality negative samples. Neither purely generative nor purely contrastive methods can balance robustness and performance. To address these issues, we propose a self-supervised graph representation learning method that integrates generative and contrastive ideas: Contrastive Generative Message Passing Graph Learning (CGMP-GL). CGMP-GL incorporates contrastive objectives into both the generative model and the message-aggregation module, improving the discriminability of node representations by aligning positive samples and separating negative samples. On the one hand, CGMP-GL integrates multi-granularity topology and feature information through cross-view, multi-level contrast while reconstructing masked node features. On the other hand, CGMP-GL refines node representations through self-supervised contrastive message passing, thereby improving performance on a variety of downstream tasks. Extensive experiments across multiple datasets and downstream tasks demonstrate the effectiveness and robustness of CGMP-GL.
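The following is a minimal sketch of the general recipe the abstract describes: mask node features, reconstruct them with a generative decoder, and add an InfoNCE-style contrastive term that aligns the same node across two views (positives) while separating different nodes (negatives). Everything here (the names MaskedGraphEncoder, GCNLayer, and info_nce, the single-layer encoder, the masking rate, and the identity matrix standing in for a normalized adjacency) is an illustrative assumption, not the authors' CGMP-GL implementation.

```python
# Hypothetical sketch of generative (masked reconstruction) + contrastive
# (cross-view InfoNCE) self-supervised graph learning. NOT the paper's code.
import torch
import torch.nn as nn
import torch.nn.functional as F


class GCNLayer(nn.Module):
    """One propagation step: act(A_hat @ X @ W)."""

    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, a_hat, x):
        return F.relu(self.lin(a_hat @ x))


class MaskedGraphEncoder(nn.Module):
    """Encoder/decoder pair trained by reconstructing masked node features."""

    def __init__(self, in_dim, hid_dim, mask_rate=0.3):
        super().__init__()
        self.mask_rate = mask_rate
        self.mask_token = nn.Parameter(torch.zeros(in_dim))
        self.encoder = GCNLayer(in_dim, hid_dim)
        self.decoder = nn.Linear(hid_dim, in_dim)

    def forward(self, a_hat, x):
        # Randomly mask a fraction of nodes, replacing their features
        # with a learnable mask token (the generative pretext task).
        mask = torch.rand(x.size(0), device=x.device) < self.mask_rate
        x_masked = x.clone()
        x_masked[mask] = self.mask_token
        z = self.encoder(a_hat, x_masked)            # node representations
        x_rec = self.decoder(z)                      # reconstructed features
        rec_loss = F.mse_loss(x_rec[mask], x[mask])  # only on masked nodes
        return z, rec_loss


def info_nce(z1, z2, tau=0.5):
    """Cross-view contrast: the same node in both views is the positive
    pair; all other nodes serve as negatives."""
    z1, z2 = F.normalize(z1, dim=1), F.normalize(z2, dim=1)
    logits = z1 @ z2.t() / tau                       # (N, N) similarities
    labels = torch.arange(z1.size(0), device=z1.device)
    return F.cross_entropy(logits, labels)


# Toy usage: two stochastically masked views of the same graph yield both
# a reconstruction loss (generative) and a contrastive alignment loss.
n, d, h = 100, 16, 32
x = torch.randn(n, d)
a_hat = torch.eye(n)   # stand-in for a normalized adjacency matrix
model = MaskedGraphEncoder(d, h)
z1, rec1 = model(a_hat, x)
z2, rec2 = model(a_hat, x)
loss = rec1 + rec2 + info_nce(z1, z2)
loss.backward()
```

In a full method along the lines the abstract suggests, the two views would come from distinct graph augmentations, the contrast would operate at multiple levels (node, subgraph, graph), and the contrastive signal would also shape message aggregation itself; this sketch only shows the combined generative-plus-contrastive objective in its simplest form.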
Journal Introduction:
Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.