Dynamic Graph Convolutional Network: A Topology Optimization Perspective

Bowen Deng, Aimin Jiang

2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP)

Published: 2021-10-25

DOI: 10.1109/mlsp52302.2021.9596206
Citations: 0
Abstract
Recently, graph convolutional networks (GCNs) have drawn increasing attention in many domains, e.g., social networks and recommendation systems. It is known that, in the task of graph node classification, inter-class edges connecting nodes from different categories often degrade GCN model performance. On the other hand, a stronger intra-class connection, in terms of both edge number and edge weights, is always beneficial to node classification. Most existing GCN models assume that the topology and edge weights of the underlying graph are both fixed. However, real-world networks are often noisy and incomplete. To account for such uncertainty in graph topology, we propose in this paper a dynamic graph convolutional network (DyGCN), where edge weights are treated as learnable parameters. A novel adaptive edge dropping (AdaDrop) strategy is developed for DyGCN, such that even the graph topology can be optimized. DyGCN is also a flexible architecture that can readily be combined with other deep GCN models to cope with the over-smoothing encountered when the network goes very deep. Experimental results demonstrate that the proposed DyGCN and its deep variants achieve competitive classification accuracy on many datasets.
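To make the two core ideas of the abstract concrete, the following is a minimal pure-Python sketch of (a) graph convolution with per-edge learnable weights and (b) an AdaDrop-style pruning step that removes edges whose learned weight has shrunk toward zero. This is an illustrative reconstruction, not the paper's actual implementation: the function names (`propagate`, `ada_drop`), the fixed drop threshold, and the toy graph are all assumptions introduced here for clarity.

```python
def propagate(features, edges, weights):
    """One graph-convolution step where each edge carries its own
    learnable weight (the DyGCN idea, sketched without a framework).

    features: dict node -> list[float] (node feature vectors)
    edges:    list of (u, v) pairs, each undirected edge listed once
    weights:  dict (u, v) -> float, the per-edge learnable parameters
    """
    dim = len(next(iter(features.values())))
    out = {n: [0.0] * dim for n in features}
    # Symmetric aggregation: an edge contributes in both directions,
    # scaled by its current (learned) weight.
    for (u, v) in edges:
        w = weights[(u, v)]
        for k in range(dim):
            out[u][k] += w * features[v][k]
            out[v][k] += w * features[u][k]
    # Self-loop keeps each node's own signal.
    for n in features:
        for k in range(dim):
            out[n][k] += features[n][k]
    return out


def ada_drop(edges, weights, threshold=0.1):
    """AdaDrop-style pruning sketch: discard edges whose learned weight
    fell below a threshold, so the topology itself is optimized.
    The fixed threshold is an illustrative assumption."""
    return [e for e in edges if weights[e] >= threshold]


# Toy graph: nodes 0 and 1 share a class; node 2 belongs to another.
features = {0: [1.0, 0.0], 1: [1.0, 0.0], 2: [0.0, 1.0]}
edges = [(0, 1), (1, 2)]
# Suppose training drove the inter-class edge (1, 2) toward zero.
weights = {(0, 1): 0.9, (1, 2): 0.05}

pruned = ada_drop(edges, weights)   # the noisy inter-class edge is dropped
h = propagate(features, pruned, weights)
```

In this toy run the inter-class edge (1, 2) is pruned, so node 2's features are no longer mixed into node 1's representation, which is exactly the effect the abstract attributes to removing inter-class edges.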