{"title":"使用带有深度上下文特征的微调 BERT 模型进行基于方面的情感分析","authors":"Abraham Rajan, Manohar Manur","doi":"10.11591/ijai.v13.i2.pp1250-1261","DOIUrl":null,"url":null,"abstract":"Sentiment analysis is the task of analysing, processing, inferencing and concluding the subjective texts along with sentiment. Considering the application of sentiment analysis, it is categorized into document-level, sentence-level and aspect level. In past, several researches have achieved solutions through the bidirectional encoder representations from transformers (BERT) model, however, the existing model does not understand the context of the aspect in deep, which leads to low metrics. This research work leads to the study of the aspect-based sentiment analysis presented by deep context bidirectional encoder representations from transformers (DC-BERT), main aim of the DC-BERT model is to improvise the context understating for aspects to enhance the metrics. DC-BERT model comprises fine-tuned BERT model along with a deep context features layer, which enables the model to understand the context of targeted aspects deeply. A customized feature layer is introduced to extract two distinctive features, later both features are integrated through the interaction layer. DC-BERT mode is evaluated considering the review dataset of laptops and restaurants from SemEval 2014 task 4, evaluation is carried out considering the different metrics. In comparison with the other model, DC-BERT achieves an accuracy of 84.48% and 92.86% for laptop and restaurant datasets respectively.","PeriodicalId":507934,"journal":{"name":"IAES International Journal of Artificial Intelligence (IJ-AI)","volume":"1 5","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Aspect based sentiment analysis using fine-tuned BERT model with deep context features\",\"authors\":\"Abraham Rajan, Manohar Manur\",\"doi\":\"10.11591/ijai.v13.i2.pp1250-1261\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sentiment analysis is the task of analysing, processing, inferencing and concluding the subjective texts along with sentiment. Considering the application of sentiment analysis, it is categorized into document-level, sentence-level and aspect level. In past, several researches have achieved solutions through the bidirectional encoder representations from transformers (BERT) model, however, the existing model does not understand the context of the aspect in deep, which leads to low metrics. This research work leads to the study of the aspect-based sentiment analysis presented by deep context bidirectional encoder representations from transformers (DC-BERT), main aim of the DC-BERT model is to improvise the context understating for aspects to enhance the metrics. DC-BERT model comprises fine-tuned BERT model along with a deep context features layer, which enables the model to understand the context of targeted aspects deeply. A customized feature layer is introduced to extract two distinctive features, later both features are integrated through the interaction layer. DC-BERT mode is evaluated considering the review dataset of laptops and restaurants from SemEval 2014 task 4, evaluation is carried out considering the different metrics. 
In comparison with the other model, DC-BERT achieves an accuracy of 84.48% and 92.86% for laptop and restaurant datasets respectively.\",\"PeriodicalId\":507934,\"journal\":{\"name\":\"IAES International Journal of Artificial Intelligence (IJ-AI)\",\"volume\":\"1 5\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IAES International Journal of Artificial Intelligence (IJ-AI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.11591/ijai.v13.i2.pp1250-1261\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IAES International Journal of Artificial Intelligence (IJ-AI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11591/ijai.v13.i2.pp1250-1261","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Aspect based sentiment analysis using fine-tuned BERT model with deep context features
Sentiment analysis is the task of analysing, processing, and drawing conclusions from subjective texts together with the sentiment they express. Depending on the application, sentiment analysis is categorized into document-level, sentence-level, and aspect-level analysis. In the past, several studies have built solutions on the bidirectional encoder representations from transformers (BERT) model; however, existing models do not understand the context of an aspect in depth, which leads to low metrics. This research work presents aspect-based sentiment analysis with deep context bidirectional encoder representations from transformers (DC-BERT); the main aim of the DC-BERT model is to improve the context understanding of aspects and thereby enhance the metrics. The DC-BERT model comprises a fine-tuned BERT model along with a deep context features layer, which enables the model to understand the context of targeted aspects deeply. A customized feature layer is introduced to extract two distinctive features, which are then integrated through an interaction layer. The DC-BERT model is evaluated on the laptop and restaurant review datasets from SemEval 2014 Task 4, and the evaluation is carried out with respect to different metrics. In comparison with other models, DC-BERT achieves accuracies of 84.48% and 92.86% on the laptop and restaurant datasets, respectively.
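To make the described architecture concrete, below is a minimal PyTorch sketch of an aspect-based sentiment classifier in the spirit of the abstract: a fine-tuned BERT encoder, a deep context features layer, two distinctive feature views (sentence-level and aspect-level), and an interaction layer that fuses them before classification. This is not the authors' implementation; the abstract does not specify layer internals, so the class name, the choice of a BiLSTM as the context layer, the projection sizes, and the pooling scheme are all illustrative assumptions.

```python
# Illustrative sketch (not the paper's released code): BERT encoder plus an
# assumed "deep context features" BiLSTM, two feature views, and an
# interaction layer that fuses them, as outlined in the abstract.
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizer


class DCBertSketch(nn.Module):
    def __init__(self, num_classes: int = 3, hidden: int = 768):
        super().__init__()
        self.bert = BertModel.from_pretrained("bert-base-uncased")
        # Deep context features layer: a BiLSTM over the BERT token states
        # (one plausible realization; the paper does not specify this choice).
        self.context_lstm = nn.LSTM(hidden, hidden // 2,
                                    batch_first=True, bidirectional=True)
        # Customized feature layer producing two distinctive feature views.
        self.sentence_proj = nn.Linear(hidden, hidden)
        self.aspect_proj = nn.Linear(hidden, hidden)
        # Interaction layer: fuse the two views into one representation.
        self.interaction = nn.Linear(2 * hidden, hidden)
        self.classifier = nn.Linear(hidden, num_classes)

    def forward(self, input_ids, attention_mask, token_type_ids):
        out = self.bert(input_ids=input_ids,
                        attention_mask=attention_mask,
                        token_type_ids=token_type_ids)
        tokens = out.last_hidden_state                    # (B, T, H)
        ctx, _ = self.context_lstm(tokens)                # deep context features
        sent_feat = self.sentence_proj(ctx.mean(dim=1))   # sentence-level view
        # Aspect-level view: mean-pool only the aspect segment (segment id 1).
        aspect_mask = token_type_ids.unsqueeze(-1).float()
        aspect_feat = self.aspect_proj(
            (ctx * aspect_mask).sum(dim=1) / aspect_mask.sum(dim=1).clamp(min=1))
        fused = torch.tanh(self.interaction(
            torch.cat([sent_feat, aspect_feat], dim=-1)))  # interaction layer
        return self.classifier(fused)


tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = DCBertSketch()
# Encode a review sentence and a target aspect as a sentence pair.
enc = tokenizer("The battery life is great but the screen is dim.",
                "battery life", return_tensors="pt")
logits = model(**enc)
print(logits.shape)  # torch.Size([1, 3]) -> negative / neutral / positive
```

In this sketch the sentence-aspect pair is fed to BERT as two segments, so the aspect-level view can be pooled from the aspect tokens alone while the sentence-level view summarizes the whole review; fine-tuning would proceed with a standard cross-entropy loss over the three polarity classes.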