Qinlu Zhao, Xiaodong Cai, Chaocun Chen, Lu Lv, Mingyao Chen
DOI: 10.1109/IAEAC.2017.8054369
Published in: 2017 IEEE 2nd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC), 2017-03-25
Citations: 6
Commented content classification with deep neural network based on attention mechanism

Abstract
It is difficult to fully represent text information with a shallow network, while a deep neural network is time-consuming to train. This paper proposes a CNN-Attention network based on a Convolutional Neural Network with Attention (CNNA) mechanism. First, contextual information between words is captured by convolution kernels of different sizes. Second, an attention layer is added to the convolutional network to obtain semantic codes that encode the attention probability distribution over the input text sequence. The weights of the text representation are then computed from this distribution, and finally a softmax layer classifies the sentiment of each sentence. Experimental results show that the proposed method extracts features for different context widths while reducing the depth of the network and effectively improving accuracy, reaching 95.15% on the COAE2014 task 4 micro-blog dataset for sentiment classification.
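The pipeline described above (multi-width convolutions over word embeddings, an attention layer producing a probability distribution over positions, a weighted-sum text representation, and a softmax classifier) can be sketched as follows. This is a minimal NumPy forward-pass sketch, not the authors' implementation: all dimensions, kernel sizes, and random weights are illustrative assumptions, and training is omitted.

```python
import numpy as np

rng = np.random.default_rng(0)

EMB_DIM, N_FILTERS, N_CLASSES = 8, 4, 2  # illustrative sizes

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

def conv_features(embeds, kernels):
    """Slide each kernel (k, EMB_DIM, N_FILTERS) over the word axis.

    Stacks the feature maps of all kernel widths into one
    (positions, N_FILTERS) matrix, so n-grams of several sizes
    contribute context features, as in the multi-kernel CNN step.
    """
    maps = []
    for kern in kernels:                          # kern: (k, EMB_DIM, N_FILTERS)
        k = kern.shape[0]
        for i in range(embeds.shape[0] - k + 1):
            window = embeds[i:i + k]              # (k, EMB_DIM) word window
            maps.append(np.tanh(
                np.tensordot(window, kern, axes=([0, 1], [0, 1]))))
    return np.array(maps)                         # (positions, N_FILTERS)

def attend(feats, v):
    """Attention layer: probability distribution over positions,
    then a weighted sum as the text representation."""
    alpha = softmax(feats @ v)                    # (positions,), sums to 1
    return alpha, alpha @ feats                   # context vector (N_FILTERS,)

def classify(embeds, kernels, v, W):
    feats = conv_features(embeds, kernels)
    alpha, ctx = attend(feats, v)
    probs = softmax(W @ ctx)                      # softmax sentiment classifier
    return alpha, probs

# Toy sentence of 6 "word embeddings"; kernel widths 2 and 3 (assumed values).
sent = rng.normal(size=(6, EMB_DIM))
kernels = [rng.normal(size=(k, EMB_DIM, N_FILTERS)) for k in (2, 3)]
v = rng.normal(size=N_FILTERS)                    # attention scoring vector
W = rng.normal(size=(N_CLASSES, N_FILTERS))       # output projection

alpha, probs = classify(sent, kernels, v, W)
print(alpha.sum(), probs.sum())                   # both sums are (approximately) 1.0
```

With widths 2 and 3 over a 6-word sentence, the stacked feature matrix has 5 + 4 = 9 positions; the attention weights `alpha` form a valid probability distribution over them, and `probs` over the sentiment classes.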