{"title":"Rethinking the Feature Iteration Process of Graph Convolution Networks","authors":"Bisheng Tang, Xiaojun Chen, Dakui Wang, Zhendong Zhao","doi":"10.1109/IJCNN55064.2022.9892737","DOIUrl":null,"url":null,"abstract":"Node classification is a fundamental research problem in graph neural networks(GNNs), which uses node's feature and label to capture node embedding in a low dimension. The existing graph node classification approaches mainly focus on GNNs from global and local perspectives. The relevant research is relatively insufficient for the micro perspective, which refers to the feature itself. In this paper, we prove that deeper GCNs' features will be updated with the same coefficient in the same dimension, limiting deeper GCNs' expression. To overcome the limits of the deeper GCN model, we propose a zero feature (k-ZF) method to train GCNs. Specifically, k-ZF randomly sets the initial k feature value to zero, acting as a data rectifier and augmenter, and is also a skill equipped with GCNs models and other GCNs skills. Extensive experiments based on three public datasets show that k-ZF significantly improves GCNs in the feature aspect and achieves competitive accuracy.","PeriodicalId":106974,"journal":{"name":"2022 International Joint Conference on Neural Networks (IJCNN)","volume":"123 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN55064.2022.9892737","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Node classification is a fundamental research problem for graph neural networks (GNNs), which use node features and labels to learn low-dimensional node embeddings. Existing graph node classification approaches mainly study GNNs from global and local perspectives; research from the micro perspective, i.e., the features themselves, remains relatively insufficient. In this paper, we prove that in deeper GCNs, features in the same dimension are updated with the same coefficient, which limits the expressiveness of deeper GCNs. To overcome this limitation of deeper GCN models, we propose a k-zero-feature (k-ZF) method for training GCNs. Specifically, k-ZF randomly sets k initial feature values to zero, acting as a data rectifier and augmenter, and it can be combined with existing GCN models and other GCN training techniques. Extensive experiments on three public datasets show that k-ZF significantly improves GCNs on the feature side and achieves competitive accuracy.
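To make the k-ZF idea concrete, the following is a minimal sketch of randomly zeroing k entries of the initial node feature matrix before GCN training. The function name `k_zero_feature` and the sampling scheme (sampling k entries uniformly over the whole feature matrix) are assumptions for illustration; the paper's exact procedure may sample per node or per dimension and may differ in detail.

```python
import numpy as np

def k_zero_feature(X, k, rng=None):
    """Randomly set k entries of the input feature matrix X to zero.

    Sketch of the k-ZF idea from the abstract: zeroing part of the
    initial features acts as a simple data rectifier/augmenter before
    GCN training. The uniform sampling over all entries is an
    assumption, not necessarily the paper's exact scheme.
    """
    rng = np.random.default_rng() if rng is None else rng
    X = X.copy()
    n_rows, n_cols = X.shape
    # Sample k distinct positions in the feature matrix and zero them out.
    flat_idx = rng.choice(n_rows * n_cols, size=k, replace=False)
    rows, cols = np.unravel_index(flat_idx, (n_rows, n_cols))
    X[rows, cols] = 0.0
    return X

# Example: zero 5 entries of a small random feature matrix
# (4 nodes with 3-dimensional features) before feeding it to a GCN.
X = np.random.rand(4, 3)
X_masked = k_zero_feature(X, k=5)
```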