DREAM: A Dual Variational Framework for Unsupervised Graph Domain Adaptation
Nan Yin; Li Shen; Mengzhu Wang; Xinwang Liu; Chong Chen; Xian-Sheng Hua
IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 47, no. 11, pp. 10787-10800
DOI: 10.1109/TPAMI.2025.3596054
Published: 2025-08-05
URL: https://ieeexplore.ieee.org/document/11113417/
Abstract
Graph classification is a prominent problem in graph machine learning. It has typically been addressed by leveraging message passing neural networks (MPNNs) to learn powerful graph representations. However, MPNNs extract topological semantics implicitly under label supervision, and thus can suffer from domain shift and label scarcity in unsupervised domain adaptation settings. In this paper, we propose an effective solution named Dual Variational Semantics Graph Mining (DREAM) for unsupervised graph domain adaptation that combines graph structural semantics from complementary perspectives. In addition to a message passing branch that learns implicit semantics, DREAM trains a path aggregation branch, which supplies explicit high-order structural semantics as a supplement. To train the two branches jointly, we employ an expectation-maximization (EM) style variational framework that maximizes the likelihood. In the E-step, we fix the message passing branch and construct a graph-of-graph that captures the geometric correlation between the source and target domains, which is then used to optimize the other branch. In the M-step, we train the message passing branch and update the graph neural networks on the graph-of-graph with the other branch fixed. This alternating optimization improves the collaboration of knowledge between the two branches. Extensive experiments on several benchmark datasets validate the superiority of the proposed DREAM over various baselines.
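As a rough illustration of the alternating EM-style scheme described in the abstract, the sketch below shows two placeholder branches trained in turn: the E-step freezes the message passing branch and builds a graph-of-graph from its embeddings before updating the path aggregation branch, and the M-step does the reverse. The class names, loss terms, and the build_graph_of_graph helper are hypothetical stand-ins written for this sketch, not the authors' implementation.

```python
# Minimal sketch (PyTorch) of an EM-style alternating optimization of two
# branches, loosely following the abstract. All names below are hypothetical
# placeholders, not the authors' code.
import torch
import torch.nn as nn


class MessagePassingBranch(nn.Module):
    """Stand-in for the MPNN branch that learns implicit structural semantics."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        return self.classifier(self.encoder(x))


class PathAggregationBranch(nn.Module):
    """Stand-in for the branch that aggregates explicit high-order path semantics."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.encoder = nn.Sequential(nn.Linear(in_dim, hid_dim), nn.ReLU())
        self.classifier = nn.Linear(hid_dim, n_classes)

    def forward(self, x):
        return self.classifier(self.encoder(x))


def build_graph_of_graph(src_emb, tgt_emb, k=5):
    """Toy graph-of-graph: link each graph embedding to its k nearest
    neighbours across both domains by cosine similarity."""
    emb = torch.cat([src_emb, tgt_emb], dim=0)
    sim = nn.functional.cosine_similarity(emb.unsqueeze(1), emb.unsqueeze(0), dim=-1)
    adj = torch.zeros_like(sim)
    idx = sim.topk(k + 1, dim=-1).indices[:, 1:]  # drop self-similarity
    adj.scatter_(1, idx, 1.0)
    return adj


def train_dream_style(mp_branch, pa_branch, src_x, src_y, tgt_x, epochs=10):
    ce = nn.CrossEntropyLoss()
    opt_mp = torch.optim.Adam(mp_branch.parameters(), lr=1e-3)
    opt_pa = torch.optim.Adam(pa_branch.parameters(), lr=1e-3)
    for _ in range(epochs):
        # E-step: freeze the message passing branch, build the graph-of-graph
        # from its embeddings, then update the path aggregation branch.
        with torch.no_grad():
            src_emb = mp_branch.encoder(src_x)
            tgt_emb = mp_branch.encoder(tgt_x)
            adj = build_graph_of_graph(src_emb, tgt_emb)  # GNN over this graph omitted
        loss_pa = ce(pa_branch(src_x), src_y)
        opt_pa.zero_grad(); loss_pa.backward(); opt_pa.step()

        # M-step: freeze the path aggregation branch and update the message
        # passing branch, here with a simple agreement term on target graphs.
        with torch.no_grad():
            pa_logits_tgt = pa_branch(tgt_x)
        agree = nn.functional.kl_div(
            mp_branch(tgt_x).log_softmax(-1), pa_logits_tgt.softmax(-1),
            reduction="batchmean")
        loss_mp = ce(mp_branch(src_x), src_y) + agree
        opt_mp.zero_grad(); loss_mp.backward(); opt_mp.step()
```

The sketch only conveys the control flow of the alternation; the paper's variational objective, the GNN operating on the graph-of-graph, and the path aggregation mechanics are abstracted away.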