Full Attention-Based Bi-GRU Neural Network for News Text Classification

Qinting Tang, Jian Li, Jiayu Chen, Hengtong Lu, Yu Du, Kehan Yang
{"title":"Full Attention-Based Bi-GRU Neural Network for News Text Classification","authors":"Qinting Tang, Jian Li, Jiayu Chen, Hengtong Lu, Yu Du, Kehan Yang","doi":"10.1109/ICCC47050.2019.9064061","DOIUrl":null,"url":null,"abstract":"This paper proposes a novel approach for text classification by using attention mechanism. In recent works, several models based on deep learning with traditional attention mechanism mainly learn the weights of steps in the entire text. However, the information of each step is filtered by the encoder, and the same information has different effects on different steps. This paper proposes a full attention-based bidirectional GRU (Bi-GRU) neural network, which is called FABG. FABG uses a Bi-GRU to learn the semantic information of text, and uses full attention mechanism to learn the weights of previous and current outputs of the Bi-GRU at each step, which enables the representation of each step to obtain the important information and ignore the irrelevant information. Finally, through a pooling layer, we get the representation of the text. Thereby FABG can learn more information, which enhances the effect of text classification. Experiments on the English news dataset agnews and the Chinese news dataset chnews show that FABG achieve better performance than the baselines.","PeriodicalId":6739,"journal":{"name":"2019 IEEE 5th International Conference on Computer and Communications (ICCC)","volume":"1 1","pages":"1970-1974"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE 5th International Conference on Computer and Communications (ICCC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCC47050.2019.9064061","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

This paper proposes a novel attention-based approach for text classification. In recent work, several deep learning models with a traditional attention mechanism mainly learn the weights of the steps over the entire text. However, the information at each step is filtered by the encoder, and the same information affects different steps differently. This paper proposes a full attention-based bidirectional GRU (Bi-GRU) neural network, called FABG. FABG uses a Bi-GRU to learn the semantic information of the text, and uses a full attention mechanism to learn the weights of the previous and current outputs of the Bi-GRU at each step, which enables the representation at each step to capture the important information and ignore the irrelevant information. Finally, a pooling layer produces the representation of the text. In this way, FABG can learn more information, which improves text classification. Experiments on the English news dataset agnews and the Chinese news dataset chnews show that FABG achieves better performance than the baselines.
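The abstract does not give implementation details, but the pipeline it describes (embedding, Bi-GRU, a "full attention" that weights the previous and current Bi-GRU outputs at each step, then pooling and a classifier) can be sketched roughly as follows. This is a minimal PyTorch sketch under assumptions: the class name FABG here, the layer sizes, the additive scoring function, and the max-pooling choice are illustrative and not taken from the paper.

```python
# Minimal sketch of the FABG pipeline as described in the abstract.
# All layer sizes, the additive scoring function, and max pooling are
# assumptions for illustration; they are not specified by the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class FABG(nn.Module):
    def __init__(self, vocab_size, embed_dim=300, hidden_dim=128, num_classes=4):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        self.bigru = nn.GRU(embed_dim, hidden_dim,
                            bidirectional=True, batch_first=True)
        # Scoring layers for the attention over Bi-GRU outputs (assumed form).
        self.attn_proj = nn.Linear(2 * hidden_dim, 2 * hidden_dim)
        self.attn_score = nn.Linear(2 * hidden_dim, 1, bias=False)
        self.classifier = nn.Linear(2 * hidden_dim, num_classes)

    def forward(self, token_ids):                     # token_ids: (batch, seq_len)
        h, _ = self.bigru(self.embedding(token_ids))  # h: (batch, seq_len, 2*hidden)
        seq_len = h.size(1)
        # One score per Bi-GRU output.
        scores = self.attn_score(torch.tanh(self.attn_proj(h))).squeeze(-1)  # (batch, seq_len)
        # "Full attention": the representation at step t attends over the
        # previous and current outputs (steps <= t), ignoring later ones.
        mask = torch.tril(torch.ones(seq_len, seq_len, dtype=torch.bool, device=h.device))
        scores = scores.unsqueeze(1).expand(-1, seq_len, -1).masked_fill(~mask, float("-inf"))
        weights = F.softmax(scores, dim=-1)            # (batch, seq_len, seq_len)
        context = torch.bmm(weights, h)                # (batch, seq_len, 2*hidden)
        # Pooling over steps gives the text representation fed to the classifier.
        text_repr = context.max(dim=1).values          # (batch, 2*hidden)
        return self.classifier(text_repr)
```

The lower-triangular mask is one way to read "the weights of previous and current outputs at each step"; the paper's exact attention formulation and pooling layer may differ.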