Sentiment Analysis of Chinese Reviews Based on BiTCN-Attention Model

Jiajing Zhang, Tingting Zhang, Jinlan Chen
Journal: Int. J. Found. Comput. Sci.
DOI: https://doi.org/10.1142/s0129054122420138
Published: 2022-08-11 (Journal Article)
Citations: 2

Abstract

Analyzing and mining the sentiment expressed in reviews is of great significance to individuals, enterprises, and government departments. Many deep learning models are used for text sentiment analysis, and the BiTCN model performs well at this task. In actual semantic expression, however, each word contributes differently to the sentiment tendency, whereas BiTCN weights all words equally and pays no extra attention to key sentiment words. To address this problem, this paper proposes a sentiment analysis model based on BiTCN-Attention. A Self-Attention mechanism and a Multi-Head Self-Attention mechanism are added to BiTCN to form BiTCN-SA and BiTCN-MHSA, respectively; these increase the weight given to sentiment words and improve the accuracy of feature extraction, thereby improving sentiment analysis performance. Experimental results show that on the JingDong commodity review data set, the accuracies of BiTCN-SA and BiTCN-MHSA are 3.96% and 2.41% higher than that of BiTCN, respectively. On the DianPing review data set, the accuracies of BiTCN-SA and BiTCN-MHSA improve by 4.62% and 3.49%, respectively, compared with BiTCN.
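The core idea above is that a self-attention layer re-weights each word's features by its relevance to the other words, so key sentiment words contribute more than BiTCN's uniform treatment allows. The following is a minimal numpy sketch of multi-head self-attention over a sequence of word features; it is an illustration only, not the paper's implementation: the random projection matrices stand in for learned weights, and all dimensions (5 tokens, 8-dim features, 2 heads) are assumptions chosen for the example.

```python
import numpy as np

def softmax(x, axis=-1):
    """Numerically stable softmax along the given axis."""
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def multi_head_self_attention(x, num_heads, rng):
    """x: (seq_len, d_model) word features.

    Each head projects x to queries/keys/values, computes scaled
    dot-product attention weights (rows sum to 1), and re-weights
    the values; head outputs are concatenated back to d_model.
    """
    seq_len, d_model = x.shape
    assert d_model % num_heads == 0
    d_k = d_model // num_heads
    heads = []
    for _ in range(num_heads):
        # Random stand-ins for the learned projection matrices.
        Wq = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Wk = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Wv = rng.standard_normal((d_model, d_k)) / np.sqrt(d_model)
        Q, K, V = x @ Wq, x @ Wk, x @ Wv
        attn = softmax(Q @ K.T / np.sqrt(d_k))  # (seq_len, seq_len)
        heads.append(attn @ V)                  # (seq_len, d_k)
    return np.concatenate(heads, axis=-1)       # (seq_len, d_model)

rng = np.random.default_rng(0)
x = rng.standard_normal((5, 8))  # 5 tokens, 8-dim features
out = multi_head_self_attention(x, num_heads=2, rng=rng)
print(out.shape)  # (5, 8): same shape, but attention-weighted
```

The single-head case (BiTCN-SA in the paper's terms) is `num_heads=1`; multiple heads (BiTCN-MHSA) let different heads attend to different aspects of the sequence before their outputs are concatenated.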