Text Classification Based on Graph Convolution Neural Network and Attention Mechanism
Sheping Zhai, Wenqing Zhang, Dabao Cheng, Xiaoxia Bai
Proceedings of the 2022 5th International Conference on Artificial Intelligence and Pattern Recognition, 2022-09-23. DOI: 10.1145/3573942.3573963
Abstract
Extracting and representing text features is the most important part of text classification. To address the problem of incomplete feature extraction in traditional text classification methods, a text classification model based on a graph convolution neural network and an attention mechanism is proposed. First, the text is fed into a BERT (Bidirectional Encoder Representations from Transformers) model to obtain word vector representations, the contextual semantic information of the text is learned by a BiGRU (Bidirectional Gated Recurrent Unit), and the important information is selected by an attention mechanism and used as node features. Second, the dependency syntax graph of the input text and its corresponding adjacency matrix are constructed. Third, a GCN (Graph Convolution Neural Network) is used to learn from the node features and the adjacency matrix. Finally, the resulting text features are fed into a classifier for text classification. Experiments on two datasets show that the proposed model achieves good classification performance and higher accuracy than the baseline models.
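The abstract describes a BERT → BiGRU → attention → GCN → classifier pipeline. The following PyTorch sketch illustrates one plausible arrangement of these components; the module sizes, the single-layer GCN, the symmetric adjacency normalization, and the mean-pooling step are assumptions for illustration, not the authors' exact configuration, and the dependency-parse adjacency matrix is assumed to be built externally (e.g., with a syntactic parser).

```python
# Minimal sketch (assumed architecture): BERT word vectors -> BiGRU context encoding ->
# token-level attention -> one GCN layer over a dependency-parse adjacency matrix ->
# linear classifier. Hyperparameters are illustrative.
import torch
import torch.nn as nn
from transformers import BertModel


class BertBiGRUAttGCN(nn.Module):
    def __init__(self, num_classes, hidden=128, bert_name="bert-base-uncased"):
        super().__init__()
        self.bert = BertModel.from_pretrained(bert_name)            # word vector representations
        self.bigru = nn.GRU(self.bert.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)   # contextual semantics
        self.att = nn.Linear(2 * hidden, 1)                         # attention scores per token
        self.gcn = nn.Linear(2 * hidden, 2 * hidden)                # single graph-convolution layer
        self.cls = nn.Linear(2 * hidden, num_classes)               # final classifier

    def forward(self, input_ids, attention_mask, adj):
        # adj: (batch, seq_len, seq_len) adjacency matrix from the dependency parse
        h = self.bert(input_ids=input_ids,
                      attention_mask=attention_mask).last_hidden_state
        h, _ = self.bigru(h)                                        # (B, T, 2H)
        alpha = torch.softmax(self.att(h), dim=1)                   # (B, T, 1) attention weights
        node_feats = alpha * h                                      # attention-weighted node features

        # Symmetric normalization with self-loops: D^{-1/2} (A + I) D^{-1/2}
        a = adj + torch.eye(adj.size(1), device=adj.device)
        d = a.sum(-1).clamp(min=1.0).pow(-0.5)
        a = d.unsqueeze(-1) * a * d.unsqueeze(-2)

        g = torch.relu(self.gcn(torch.bmm(a, node_feats)))          # graph convolution over nodes
        doc = g.mean(dim=1)                                         # pool node features into a text vector
        return self.cls(doc)                                        # class logits
```

A usage example would tokenize a sentence with the matching BERT tokenizer, parse it to build `adj`, and pass all three tensors to the model; the returned logits feed a standard cross-entropy loss during training.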