{"title":"Utilizing BERT for Detecting Aspect Categories on TABSA via Adjusting Self-attention among Words","authors":"Hong Hong, Jiawen Song","doi":"10.1109/ICHCI51889.2020.00022","DOIUrl":null,"url":null,"abstract":"Aspect-based sentiment analysis (ABSA) has become a popular research topic in recent years due to its strong function of breaking down text into aspects and identifying sentiment polarity towards a specific target, generating a significant amount of discussion among researchers. Motivated by recent work of application of sentence-pair classification task into ABSA, this article discusses how to further fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model and obtain the results on SentiHood dataset. In a contrast to the previous work, this article considers that the sentiment analysis has relations to every single word in each sentence and shows the process of modifying the forward network in BERT to create self-attention between words. The proposed model demonstrates a certain degree of improvement in some aspects, in particular to aspect category detection.","PeriodicalId":355427,"journal":{"name":"2020 International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICHCI51889.2020.00022","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Aspect-based sentiment analysis (ABSA) has become a popular research topic in recent years, and has generated considerable discussion among researchers, because it can break text down into aspects and identify the sentiment polarity expressed towards a specific target. Motivated by recent work applying the sentence-pair classification task to ABSA, this article discusses how to further fine-tune the pre-trained Bidirectional Encoder Representations from Transformers (BERT) model and reports results on the SentiHood dataset. In contrast to previous work, this article takes the view that sentiment analysis depends on every word in a sentence, and it describes how the feed-forward network in BERT is modified to create self-attention between words. The proposed model demonstrates a measurable improvement on several subtasks, in particular aspect category detection.
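The sentence-pair formulation the abstract refers to can be illustrated with a minimal sketch. Assuming the auxiliary-sentence construction commonly used for SentiHood (build one question-style auxiliary sentence per target-aspect pair, then feed each pair to a BERT sentence-pair classifier), the pair-building step might look as follows. The function name, question template, and aspect list are illustrative assumptions, not the authors' exact code.

```python
# SentiHood's four aspect categories (as used in prior sentence-pair work).
ASPECTS = ["general", "price", "safety", "transit-location"]

def build_pairs(sentence, targets, aspects=ASPECTS):
    """Turn aspect category detection into sentence-pair classification:
    return one (sentence_a, sentence_b) pair per target-aspect combination,
    where sentence_b is a hypothetical question-style auxiliary sentence."""
    pairs = []
    for target in targets:
        for aspect in aspects:
            auxiliary = f"what do you think of the {aspect} of {target}"
            pairs.append((sentence, auxiliary))
    return pairs

# One target and four aspects yield four sentence pairs; a BERT model
# would then classify each pair (e.g. aspect present vs. absent).
pairs = build_pairs("LOCATION1 is expensive but the tube is close", ["LOCATION1"])
```

Each resulting pair would be tokenized as `[CLS] sentence_a [SEP] sentence_b [SEP]` before fine-tuning; the paper's contribution then modifies the self-attention among the words of these inputs.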