MSLS: Meta-graph Search with Learnable Supernet for Heterogeneous Graph Neural Networks
Yili Wang, Jiamin Chen, Qiutong Li, Changlong He, Jianliang Gao
Proceedings of the 35th International Conference on Scientific and Statistical Database Management, July 10, 2023
DOI: 10.1145/3603719.3603727 (https://doi.org/10.1145/3603719.3603727)
Abstract
In recent years, heterogeneous graph neural networks (HGNNs) have achieved excellent performance. Efficient HGNNs are built from meta-graphs and aggregation operations. Since manually designing meta-graphs is an expert-dependent and time-consuming process, the performance of HGNNs is limited. To address this challenge, differentiable meta-graph search has been proposed to obtain promising meta-graphs automatically. However, previous differentiable meta-graph search constructs the supernet without learnable aggregation operations, which limits the semantic extraction ability of HGNNs with automatically designed meta-graphs on downstream tasks. To solve this problem, we propose Meta-graph Search with Learnable Supernet for Heterogeneous Graph Neural Networks (MSLS). Specifically, to obtain better-performing HGNNs, MSLS constructs a supernet with learnable aggregation operations based on the meta-graphs. MSLS adopts decoupled training for the learnable supernet and then uses a constrained evolution strategy to obtain the optimal meta-graph together with its learnable aggregation operations. Extensive experiments show that MSLS achieves the best performance across different tasks.
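
To illustrate the core idea of a supernet with learnable aggregation operations, the sketch below mixes several candidate aggregations with learnable architecture weights over an adjacency induced by one meta-graph relation. This is a minimal sketch in assumed PyTorch conventions, not the authors' implementation: the class name MixedAggregation, the candidate set {mean, max, sum}, and the toy usage are hypothetical.

```python
# Hypothetical sketch (not the paper's code): a supernet layer that combines
# candidate aggregation operations with a learnable softmax-weighted mixture.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MixedAggregation(nn.Module):
    """Softmax-weighted mixture over candidate aggregation operations."""
    def __init__(self, in_dim, out_dim):
        super().__init__()
        self.linear = nn.Linear(in_dim, out_dim)
        # Candidate aggregations over neighbor messages (illustrative choice).
        self.aggregators = ["mean", "max", "sum"]
        # Architecture parameters, trained alongside the supernet weights.
        self.alpha = nn.Parameter(torch.zeros(len(self.aggregators)))

    def aggregate(self, x, adj, kind):
        # adj: dense (N, N) adjacency induced by one meta-graph relation.
        msgs = adj.unsqueeze(-1) * x.unsqueeze(0)   # (N, N, d) neighbor messages
        if kind == "mean":
            deg = adj.sum(dim=1, keepdim=True).clamp(min=1)
            return msgs.sum(dim=1) / deg
        if kind == "max":
            return msgs.max(dim=1).values           # simplified: isolated nodes get 0
        return msgs.sum(dim=1)                      # "sum"

    def forward(self, x, adj):
        weights = F.softmax(self.alpha, dim=0)
        mixed = sum(w * self.aggregate(x, adj, k)
                    for w, k in zip(weights, self.aggregators))
        return F.relu(self.linear(mixed))


# Toy usage: 5 nodes, 8-dim features, one meta-graph-induced adjacency.
x = torch.randn(5, 8)
adj = (torch.rand(5, 5) > 0.5).float()
layer = MixedAggregation(8, 16)
out = layer(x, adj)   # (5, 16)
```

In the paper's pipeline, such mixture weights would be learned during supernet training, after which a constrained evolution strategy selects the final meta-graph and its aggregation operations; the search loop itself is not shown here.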