Title: Graph Convolutional Neural Network with Inter-layer Cascade Based on Attention Mechanism
Authors: Lu Wei, Yiting Liu, Kaiyuan Feng, Jianzhao Li, Kai Sheng, Yue Wu
DOI: 10.1109/CCIS53392.2021.9754620
Published in: 2021 IEEE 7th International Conference on Cloud Computing and Intelligent Systems (CCIS)
Publication date: 2021-11-07
Citations: 1
Abstract
In recent years, graph data in non-Euclidean spaces has been widely used, and methods and techniques for learning from graph data have been continuously developed across many areas of deep learning, such as the graph neural network (GNN). The structural characteristics of graph data, the aggregation and representation of node information, and node neighborhood information are the core issues of GNNs. However, most existing graph convolutional neural networks suffer from an over-smoothing problem, which limits the learning ability of the model. To address the over-smoothing problem of current algorithms, this paper enhances the learning of graph data by improving the expressive power of local information and global features. This paper constructs a cascade structure between graph convolutional layers. This network structure realizes dense connections between convolutional layers, ensures that local feature information is used effectively, and further enhances the graph representation ability. Self-attention and TopK selection are introduced into the readout module to selectively aggregate and express feature information and to make more efficient use of graph-level information. Graph classification is used as the downstream task to verify the performance of the proposed model. Experimental results show that this densely structured graph convolutional network can effectively aggregate local node information and global graph-level information.
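The two ideas the abstract describes — a dense inter-layer cascade between graph-convolution layers, and a TopK self-attention readout — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the layer sizes, the toy 4-node graph, and the simple row-sum attention score are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

def gcn_layer(a_hat, h, w):
    # One graph-convolution step: H' = ReLU(A_hat @ H @ W),
    # with A_hat the symmetrically normalized adjacency.
    return np.maximum(a_hat @ h @ w, 0.0)

def dense_cascade_gcn(a_hat, x, weights):
    # Dense inter-layer cascade: each layer receives the concatenation
    # of the input features and all earlier layer outputs, so local
    # information from shallow layers is reused by deeper ones.
    feats = [x]
    for w in weights:
        h_in = np.concatenate(feats, axis=1)
        feats.append(gcn_layer(a_hat, h_in, w))
    return np.concatenate(feats, axis=1)

def topk_attention_readout(h, k):
    # Readout: score each node (a row-sum stands in for a learned
    # self-attention score), keep the top-k nodes, and aggregate
    # their softmax-weighted features into one graph-level vector.
    scores = h.sum(axis=1)
    idx = np.argsort(scores)[-k:]
    w = np.exp(scores[idx] - scores[idx].max())
    w /= w.sum()
    return (w[:, None] * h[idx]).sum(axis=0)

# Toy 4-node path graph with self-loops, normalized as in a GCN.
adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
a_tilde = adj + np.eye(4)
d_inv_sqrt = np.diag(1.0 / np.sqrt(a_tilde.sum(axis=1)))
a_hat = d_inv_sqrt @ a_tilde @ d_inv_sqrt

x = rng.normal(size=(4, 3))                  # 4 nodes, 3 input features
weights = [rng.normal(size=(3, 8)),          # layer 1: sees x
           rng.normal(size=(3 + 8, 8))]      # layer 2: sees x + layer-1 output
h = dense_cascade_gcn(a_hat, x, weights)     # (4, 3 + 8 + 8) = (4, 19)
g = topk_attention_readout(h, k=2)           # graph-level vector, shape (19,)
```

The concatenation in `dense_cascade_gcn` is what makes the cascade "dense": layer weights grow with the accumulated feature width, and the final node representation carries every intermediate layer's output forward to the readout.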