{"title":"Remolding Semantic Focus with Dual Attention Mechanism for Aspect-based Sentiment Analysis","authors":"Xingda Li, Yanwei Bao, Min Hu, Fuji Ren","doi":"10.1109/ccis57298.2022.10016391","DOIUrl":null,"url":null,"abstract":"Aspect-based sentiment analysis (ABSA) is an NLP task that classify fine-grained sentiment towards one specific aspect from the same text. While attention mechanism has achieved great success, attaching aspects to abstract sentiment remains challenging. In this paper, we propose dual attention mechanism, a novel method to re-weight the distribution of attention between stack BERT layers, in prompt learning way with pretrained language model BERT. Specifically, after obtaining the most attractive words, the method raises weight of other possible corresponding words and makes model consider more comprehensively. To introduce more aspect information, we classify the sentiment in improved prompt learning way. Note that the overfitting using BERT on ABSA, we utilize the approach of staged loss that restrict the training not too small. Finally, the experiment results demonstrate the effectiveness and the stability of dual attention and provide a good insight of attention mechanism.","PeriodicalId":374660,"journal":{"name":"2022 IEEE 8th International Conference on Cloud Computing and Intelligent Systems (CCIS)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-11-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 8th International Conference on Cloud Computing and Intelligent Systems (CCIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ccis57298.2022.10016391","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
Aspect-based sentiment analysis (ABSA) is an NLP task that classifies fine-grained sentiment toward a specific aspect within a text. While attention mechanisms have achieved great success, attaching aspects to abstract sentiment remains challenging. In this paper, we propose the dual attention mechanism, a novel method that re-weights the attention distribution between stacked BERT layers, applied in a prompt-learning setting with the pretrained language model BERT. Specifically, after identifying the most-attended words, the method raises the weights of other potentially corresponding words so that the model considers the context more comprehensively. To introduce more aspect information, we classify sentiment with an improved prompt-learning formulation. To counter the overfitting that arises when fine-tuning BERT on ABSA, we adopt a staged loss that keeps the training loss from becoming too small. Finally, the experimental results demonstrate the effectiveness and stability of dual attention and provide useful insight into the attention mechanism.
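The abstract does not specify how the re-weighting is computed, but the core idea of shifting attention mass from the most-attended words onto other candidate words can be sketched in PyTorch. This is a minimal sketch under assumed behavior: the function name, the `top_k` and `alpha` parameters, and the uniform redistribution rule are illustrative assumptions, not the paper's exact formulation.

```python
import torch

def reweight_attention(attn, top_k=1, alpha=0.5):
    """Hypothetical re-weighting in the spirit of dual attention: after the
    most-attended tokens are found, move part of their attention mass onto
    the remaining tokens so the model also considers them.

    attn:  (batch, seq_len) attention distribution over tokens (rows sum to 1).
    top_k: number of top-attended tokens treated as the initial focus.
    alpha: fraction of the top tokens' mass redistributed to the rest.
    """
    top_vals, top_idx = attn.topk(top_k, dim=-1)           # most attractive words
    removed = alpha * top_vals.sum(dim=-1, keepdim=True)   # mass to redistribute
    reweighted = attn.clone()
    reweighted.scatter_(-1, top_idx, (1 - alpha) * top_vals)  # shrink the peaks
    # spread the removed mass uniformly over the other tokens
    mask = torch.ones_like(attn).scatter_(-1, top_idx, 0.0)
    reweighted = reweighted + removed * mask / mask.sum(dim=-1, keepdim=True)
    return reweighted  # rows still sum to 1
```

Because the output remains a valid distribution, such a step could in principle be applied between stacked BERT layers without disturbing the rest of the attention computation.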
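The "improved prompt learning" formulation is likewise not detailed in the abstract. A common baseline it presumably builds on is to append an aspect-bearing cloze to the sentence and read the sentiment off the [MASK] prediction of BERT's MLM head. The template and the verbalizer below are assumptions for illustration only.

```python
import torch
from transformers import BertTokenizer, BertForMaskedLM

tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertForMaskedLM.from_pretrained("bert-base-uncased")

sentence = "The food was amazing but the service was slow."
aspect = "service"
# hypothetical template: inject the aspect into a cloze and let BERT fill it
prompt = f"{sentence} The sentiment of {aspect} is {tokenizer.mask_token}."

inputs = tokenizer(prompt, return_tensors="pt")
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero()[0, 1]

with torch.no_grad():
    logits = model(**inputs).logits[0, mask_pos]

# minimal verbalizer mapping label words to sentiment classes (an assumption)
label_words = {"positive": "good", "negative": "bad", "neutral": "ok"}
scores = {lab: logits[tokenizer.convert_tokens_to_ids(w)].item()
          for lab, w in label_words.items()}
print(max(scores, key=scores.get))
```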
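Finally, a staged loss that "keeps the training loss from becoming too small" behaves like a loss floor, similar in spirit to the known "flooding" regularizer (Ishida et al., 2020). A minimal sketch under that assumption, with the floor value chosen arbitrarily, is:

```python
import torch.nn.functional as F

def staged_loss(logits, labels, floor=0.2):
    """Loss floor sketch: once cross-entropy drops below `floor`, the
    gradient direction reverses, discouraging further memorization. The
    paper's actual staging schedule may differ."""
    ce = F.cross_entropy(logits, labels)
    return (ce - floor).abs() + floor
```

This kind of floor is a simple guard against the overfitting the authors report when fine-tuning BERT on small ABSA datasets.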