Integrating Bi-Dynamic Routing Capsule Network with Label-Constraint for Text Classification

Xiang Guo, Youquan Wang, Kaiyuan Gao, Jie Cao, Haicheng Tao, Chaoyue Chen
DOI: 10.1109/ICBK50248.2020.00011
Published in: 2020 IEEE International Conference on Knowledge Graph (ICKG)
Publication date: 2020-08-01
Citations: 0

Abstract

Neural-based text classification methods have attracted increasing attention in recent years. Unlike standard text classification methods, neural-based methods learn representations of the text data end-to-end. Many useful insights can be derived from neural-based text classifiers, as demonstrated by an ever-growing body of work on text mining. However, real-world text can be both complex and noisy, which poses a problem for effective text classification. An effective way to deal with this issue is to incorporate self-attention and capsule networks into text mining solutions. In this paper, we propose a Bi-dynamic routing Capsule Network with Label-constraint (BCNL) model for text classification, which moves beyond the limitations of previous methods by automatically learning the task-relevant and label-relevant words of a text. Specifically, we use a Bi-LSTM and self-attention with a positional encoder to learn text embeddings. Meanwhile, we propose a bi-dynamic routing capsule network with a label constraint to adjust the category distribution of text capsules. Through extensive experiments on four datasets, we observe that our method outperforms state-of-the-art baseline methods.
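For readers unfamiliar with capsule routing, the sketch below shows the standard routing-by-agreement step (Sabour et al., 2017) that BCNL's bi-dynamic routing builds on. It is a minimal illustrative sketch, not the authors' implementation: the shapes, iteration count, and function names are assumptions, and the bi-directional routing and label constraint described in the abstract are not reproduced here.

```python
import numpy as np

def squash(v, axis=-1, eps=1e-8):
    # Non-linearity used in capsule networks: preserves the vector's
    # direction while mapping its length into [0, 1).
    norm_sq = np.sum(v ** 2, axis=axis, keepdims=True)
    return (norm_sq / (1.0 + norm_sq)) * v / np.sqrt(norm_sq + eps)

def dynamic_routing(u_hat, n_iters=3):
    # u_hat: prediction vectors from lower (text) capsules to upper
    # (class) capsules, shape (n_lower, n_upper, dim_upper).
    n_lower, n_upper, _ = u_hat.shape
    b = np.zeros((n_lower, n_upper))  # routing logits
    for _ in range(n_iters):
        # Coupling coefficients: softmax over upper capsules.
        c = np.exp(b) / np.exp(b).sum(axis=1, keepdims=True)
        # Weighted sum of predictions per upper capsule.
        s = np.einsum("ij,ijk->jk", c, u_hat)
        v = squash(s)  # upper-capsule outputs
        # Increase logits where predictions agree with outputs.
        b += np.einsum("ijk,jk->ij", u_hat, v)
    return v

rng = np.random.default_rng(0)
u_hat = rng.normal(size=(10, 4, 8))  # 10 text capsules, 4 classes, dim 8
v = dynamic_routing(u_hat)
print(v.shape)  # (4, 8): one output vector per class capsule
```

The length of each class capsule's output vector (always below 1 after squashing) can be read as the model's confidence in that category; the label constraint in BCNL additionally supervises how routing distributes text capsules across categories.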