Decoupled Graph Neural Architecture Search with Variable Propagation Operation and Appropriate Depth
Jianliang Gao, Changlong He, Jiamin Chen, Qiutong Li, Yili Wang
Proceedings of the 35th International Conference on Scientific and Statistical Database Management (SSDBM '23), July 10, 2023
DOI: 10.1145/3603719.3603729 (https://doi.org/10.1145/3603719.3603729)
Abstract
To alleviate the over-smoothing problem caused by deep graph neural networks, decoupled graph neural networks (DGNNs) have been proposed. DGNNs decouple the graph neural network into two atomic operations: the propagation (P) operation and the transformation (T) operation. Since manually designing DGNN architectures is a time-consuming and expert-dependent process, the DF-GNAS method was designed to automatically construct DGNN architectures with a fixed propagation operation and deep layers. The propagation operation is the key process by which DGNNs aggregate graph structure information; because DF-GNAS applies a fixed propagation operation to different graph structures, it incurs performance loss. Meanwhile, DF-GNAS designs deep DGNNs even for graphs with simple distributions, which may lead to overfitting. To address these challenges, we propose the Decoupled Graph Neural Architecture Search with Variable Propagation Operation and Appropriate Depth (DGNAS-PD) method. In DGNAS-PD, we design a DGNN operation space with variable, efficient propagation operations to better aggregate information on different graph structures. We build an effective genetic search strategy that adaptively selects an appropriate DGNN depth, rather than fixing a deep DGNN, for graphs with simple distributions. Experiments on five real-world graphs show that DGNAS-PD outperforms state-of-the-art baseline methods.
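The P/T decoupling the abstract describes is easiest to see in code. Below is a minimal NumPy sketch, not the paper's implementation: the two example propagation rules (SGC-style smoothing and APPNP-style personalized PageRank), the function names, and all hyperparameters are assumptions chosen for illustration. It shows how a decoupled GNN exposes the propagation operation and its depth as free choices, which is the kind of (operation, depth) pair a search method such as DGNAS-PD selects per graph.

# Minimal sketch of a decoupled GNN (P operation + T operation).
# All names and hyperparameters here are illustrative assumptions,
# not the operation space defined in the paper.
import numpy as np

def sym_norm_adj(A):
    # Symmetrically normalized adjacency with self-loops: D^-1/2 (A+I) D^-1/2.
    A_hat = A + np.eye(A.shape[0])
    d = A_hat.sum(axis=1)
    D_inv_sqrt = np.diag(1.0 / np.sqrt(d))
    return D_inv_sqrt @ A_hat @ D_inv_sqrt

# --- P operations: parameter-free propagation over the graph structure ---
def propagate_smooth(A_norm, X, k):
    # Plain k-step smoothing (SGC-style): X <- A_norm^k X.
    for _ in range(k):
        X = A_norm @ X
    return X

def propagate_ppr(A_norm, X, k, alpha=0.1):
    # Personalized-PageRank-style propagation (APPNP-style):
    # H <- (1 - alpha) * A_norm H + alpha * X, iterated k times.
    H = X
    for _ in range(k):
        H = (1.0 - alpha) * (A_norm @ H) + alpha * X
    return H

# --- T operation: a feature transformation independent of the graph ---
def transform_mlp(X, W1, W2):
    # One hidden ReLU layer followed by a linear output layer.
    return np.maximum(X @ W1, 0.0) @ W2

def decoupled_gnn(A, X, W1, W2, prop_op=propagate_smooth, depth=2):
    # The propagation operation and its depth are the searchable choices;
    # the transformation is applied once after propagation.
    A_norm = sym_norm_adj(A)
    H = prop_op(A_norm, X, depth)
    return transform_mlp(H, W1, W2)

# Tiny usage example on a 4-node path graph with random features/weights.
rng = np.random.default_rng(0)
A = np.array([[0, 1, 0, 0],
              [1, 0, 1, 0],
              [0, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
X = rng.normal(size=(4, 8))
W1, W2 = rng.normal(size=(8, 16)), rng.normal(size=(16, 3))
logits = decoupled_gnn(A, X, W1, W2, prop_op=propagate_ppr, depth=3)
print(logits.shape)  # (4, 3)

In this framing, a candidate architecture is fully described by the chosen propagation rule and the integer depth k, so an evolutionary search can encode both as a genome and evaluate candidates on a validation set; a shallow depth then remains reachable for graphs with simple distributions, which is the over-fitting concern the abstract raises about fixed deep designs.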