Word Level Domain-Diversity Attention Based LSTM Model for Sentiment Classification

Haoliang Zhang, Hongbo Xu, Jinqiao Shi, Tingwen Liu, Chun Liao
{"title":"基于词级领域多样性关注的LSTM情感分类模型","authors":"Haoliang Zhang, Hongbo Xu, Jinqiao Shi, Tingwen Liu, Chun Liao","doi":"10.1109/DSC50466.2020.00032","DOIUrl":null,"url":null,"abstract":"Sentiment classification is an important task in Natural Language Processing research and it has considerable application significance. The complexity of human sentimental opinion implies that the hidden information such as application scenes or domains that behind the text may play an important role in the prediction of sentiment polarity. This paper presents a novel model for Sentiment Classification, Domain-Diversity Attention Mechanism based LSTM Model (DDAM-LSTM), integrating word level domain relevant features into an input side attention mechanism of LSTM model. Firstly, we propose a representing and calculating method of domain relevant features for each word according to its context. Then we find that the common words and certain domain-specific words show obvious different distribution states as for domain tendency. On this basis, an attention mechanism is designed to assign scale weights to the words at the input side of LSTM network according to their diversity of domain tendency. By combining this unique attention mechanism with the LSTM model, we achieve the goal of fusing the implied domain knowledge with the Neural Network. Experimental results on three public benchmark datasets show that our proposed model yields obvious performance improvement.","PeriodicalId":423182,"journal":{"name":"2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC)","volume":"53 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Word Level Domain-Diversity Attention Based LSTM Model for Sentiment Classification\",\"authors\":\"Haoliang Zhang, Hongbo Xu, Jinqiao Shi, Tingwen Liu, Chun Liao\",\"doi\":\"10.1109/DSC50466.2020.00032\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sentiment classification is an important task in Natural Language Processing research and it has considerable application significance. The complexity of human sentimental opinion implies that the hidden information such as application scenes or domains that behind the text may play an important role in the prediction of sentiment polarity. This paper presents a novel model for Sentiment Classification, Domain-Diversity Attention Mechanism based LSTM Model (DDAM-LSTM), integrating word level domain relevant features into an input side attention mechanism of LSTM model. Firstly, we propose a representing and calculating method of domain relevant features for each word according to its context. Then we find that the common words and certain domain-specific words show obvious different distribution states as for domain tendency. On this basis, an attention mechanism is designed to assign scale weights to the words at the input side of LSTM network according to their diversity of domain tendency. By combining this unique attention mechanism with the LSTM model, we achieve the goal of fusing the implied domain knowledge with the Neural Network. 
Experimental results on three public benchmark datasets show that our proposed model yields obvious performance improvement.\",\"PeriodicalId\":423182,\"journal\":{\"name\":\"2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC)\",\"volume\":\"53 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/DSC50466.2020.00032\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE Fifth International Conference on Data Science in Cyberspace (DSC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DSC50466.2020.00032","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Sentiment classification is an important task in Natural Language Processing research and has considerable practical significance. The complexity of human sentiment suggests that hidden information behind the text, such as the application scenario or domain, may play an important role in predicting sentiment polarity. This paper presents a novel model for sentiment classification, the Domain-Diversity Attention Mechanism based LSTM Model (DDAM-LSTM), which integrates word-level domain-relevant features into an input-side attention mechanism of an LSTM model. First, we propose a method for representing and calculating the domain-relevant features of each word according to its context. We then find that common words and certain domain-specific words exhibit clearly different distributions of domain tendency. On this basis, an attention mechanism is designed that assigns scale weights to the words at the input side of the LSTM network according to the diversity of their domain tendency. By combining this attention mechanism with the LSTM model, we fuse the implied domain knowledge with the neural network. Experimental results on three public benchmark datasets show that the proposed model yields a clear performance improvement.
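
The abstract does not spell out the exact formulation of the domain-relevant features or the scale weights, so the sketch below only illustrates the general idea in PyTorch: each word's embedding is rescaled by a weight derived from the diversity (here, normalized entropy) of an assumed per-word domain distribution before the sequence enters the LSTM. All class, function, and parameter names (DomainDiversityLSTM, domain_tendency, diversity_weight, num_domains) are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): input-side attention that scales
# word embeddings by a weight derived from each word's "domain tendency"
# distribution, followed by an LSTM sentiment classifier.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DomainDiversityLSTM(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, num_domains, num_classes):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim)
        # One domain-tendency vector per vocabulary word, e.g. estimated
        # offline from word/domain co-occurrence statistics (assumption).
        self.domain_tendency = nn.Embedding(vocab_size, num_domains)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.classifier = nn.Linear(hidden_dim, num_classes)

    def diversity_weight(self, token_ids):
        # Diversity measured as normalized entropy of the word's domain
        # distribution: common words spread over many domains behave
        # differently from sharply domain-specific words.
        probs = F.softmax(self.domain_tendency(token_ids), dim=-1)
        entropy = -(probs * torch.log(probs + 1e-9)).sum(dim=-1)
        max_entropy = torch.log(torch.tensor(float(probs.size(-1))))
        # Map low-entropy (domain-specific) words to larger scale weights.
        return 1.0 + (1.0 - entropy / max_entropy)        # (batch, seq_len)

    def forward(self, token_ids):
        embeds = self.embedding(token_ids)                 # (batch, seq_len, embed_dim)
        weights = self.diversity_weight(token_ids)         # (batch, seq_len)
        scaled = embeds * weights.unsqueeze(-1)            # input-side attention scaling
        _, (h_n, _) = self.lstm(scaled)
        return self.classifier(h_n[-1])                    # sentiment logits


# Usage sketch with toy dimensions.
model = DomainDiversityLSTM(vocab_size=10000, embed_dim=128,
                            hidden_dim=256, num_domains=5, num_classes=2)
logits = model(torch.randint(0, 10000, (4, 20)))           # 4 sentences, 20 tokens each
```

In this sketch the per-word domain distribution is a simple lookup table; the paper instead derives domain-relevant features from each word's context, so that table would be replaced by whatever context-based estimate the method defines.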