Aspect-level sentiment analysis in social media using a hybrid deep transfer learning approach

Impact Factor: 7.6 | CAS Tier 1 (Computer Science) | JCR Q1, COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Kia Jahanbin, Mohammed Ali Zare Chahooki
{"title":"Aspect-level sentiment analysis in social media using a hybrid deep transfer learning approach","authors":"Kia Jahanbin,&nbsp;Mohammed Ali Zare Chahooki","doi":"10.1016/j.knosys.2025.114125","DOIUrl":null,"url":null,"abstract":"<div><div>In recent years, researchers have become interested in aspect-level sentiment analysis. In the traditional sentiment analysis of documents or sentences, a label was assigned to the entire sentence or document. Whereas a sentence or document can have aspects with different sentiments. Although deep learning models have succeeded in aspect-level sentiment analysis, these models require rich labeled datasets in different domains to extract text features and sentiment analysis. This paper uses deep transfer learning for sentiment analysis of aspect-level sentiment analysis (AHDT) of social network data. The backbone of the AHDT model is a version of RoBERTa’s pre-trained deep neural network specially trained to work on social data. The features extracted from the pre-trained RoBERTa network for sentiment analysis are injected into the Bi-GRU deep neural network and then the attention layer. BI-GRU can process sequences from both sides (left to right and vice versa) and extract hidden relationships. In addition, the attention layer allows the model to pay attention to the more influential aspects of the text and provide a better interpretation. Also, this article uses the Class imbalance method to balance for training the model with almost the same polarities. The test results of the AHDT model on four SemEval datasets for the aspect-sentiment analysis task show that the model has improved the F1-score value in Resturan2014, 2015, and 2016 datasets by 0.63, 27.01, and 15.93, respectively. Also, this model has increased the accuracy value in Resturan2015 and 2016 datasets to 9.21 and 0.54, respectively. In addition, the results of experimental tests in all datasets show that the obtained values of accuracy and F1-score are close to each other, which indicates the stability of the AHDT model.</div></div>","PeriodicalId":49939,"journal":{"name":"Knowledge-Based Systems","volume":"330 ","pages":"Article 114125"},"PeriodicalIF":7.6000,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Knowledge-Based Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0950705125011669","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In recent years, researchers have become increasingly interested in aspect-level sentiment analysis. Traditional sentiment analysis assigns a single label to an entire sentence or document, even though a sentence or document can contain several aspects with different sentiments. Although deep learning models have succeeded at aspect-level sentiment analysis, they require rich labeled datasets across different domains to extract text features and classify sentiment. This paper uses deep transfer learning for aspect-level sentiment analysis (AHDT) of social network data. The backbone of the AHDT model is a version of the pre-trained RoBERTa deep neural network specially trained on social media data. The features extracted from the pre-trained RoBERTa network are fed into a Bi-GRU deep neural network and then into an attention layer. The Bi-GRU can process sequences in both directions (left to right and right to left) and extract hidden relationships. In addition, the attention layer allows the model to attend to the more influential aspects of the text and provides a better interpretation. This paper also uses a class-imbalance method so that the model is trained with nearly balanced polarity classes. Test results of the AHDT model on four SemEval datasets for the aspect-sentiment analysis task show that the model improves the F1-score on the Restaurant 2014, 2015, and 2016 datasets by 0.63, 27.01, and 15.93, respectively. The model also increases accuracy on the Restaurant 2015 and 2016 datasets by 9.21 and 0.54, respectively. In addition, the experimental results on all datasets show that the obtained accuracy and F1-score values are close to each other, which indicates the stability of the AHDT model.
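The pipeline described in the abstract (a pre-trained RoBERTa encoder, a Bi-GRU over its token features, an attention layer for pooling, and class weighting against imbalance) can be sketched roughly as below. This is a minimal illustration, not the authors' implementation: the encoder checkpoint, hidden size, input format, number of classes, and class weights are all assumptions made for the example.

# Minimal sketch (assumed details, not the paper's code) of a RoBERTa -> Bi-GRU -> attention
# classifier for aspect-level polarity, in the spirit of the AHDT description.
import torch
import torch.nn as nn
from transformers import AutoModel, AutoTokenizer

ENCODER = "roberta-base"  # the paper uses a RoBERTa variant trained on social media data;
                          # any such checkpoint could be substituted here.

class RobertaBiGRUAttention(nn.Module):
    def __init__(self, encoder_name=ENCODER, hidden=128, num_classes=3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(encoder_name)
        # Bi-GRU reads the encoder's token features left-to-right and right-to-left.
        self.bigru = nn.GRU(self.encoder.config.hidden_size, hidden,
                            batch_first=True, bidirectional=True)
        self.attn = nn.Linear(2 * hidden, 1)           # per-token attention scores
        self.classifier = nn.Linear(2 * hidden, num_classes)

    def forward(self, input_ids, attention_mask):
        # Contextual token features from the pre-trained encoder.
        feats = self.encoder(input_ids=input_ids,
                             attention_mask=attention_mask).last_hidden_state
        seq, _ = self.bigru(feats)
        # Attention weights over tokens, ignoring padding positions.
        scores = self.attn(seq).squeeze(-1)
        scores = scores.masked_fill(attention_mask == 0, float("-inf"))
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)
        pooled = (weights * seq).sum(dim=1)            # attention-weighted sentence vector
        return self.classifier(pooled)                 # polarity logits

tokenizer = AutoTokenizer.from_pretrained(ENCODER)
model = RobertaBiGRUAttention()
# Sentence paired with the target aspect, one common input format for aspect-level tasks.
batch = tokenizer(["The food was great but the service was slow."], ["service"],
                  return_tensors="pt", padding=True)
logits = model(batch["input_ids"], batch["attention_mask"])
# Class imbalance can be countered with per-class loss weights (values illustrative only).
loss_fn = nn.CrossEntropyLoss(weight=torch.tensor([1.0, 2.0, 1.5]))

The class weights shown here are one simple way to balance polarity classes during training; the paper's own class-imbalance method may differ.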
Source journal: Knowledge-Based Systems (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 14.80
Self-citation rate: 12.50%
Articles published per year: 1245
Average review time: 7.8 months
Journal description: Knowledge-Based Systems, an international and interdisciplinary journal in artificial intelligence, publishes original, innovative, and creative research results in the field. It focuses on knowledge-based and other artificial intelligence techniques-based systems. The journal aims to support human prediction and decision-making through data science and computation techniques, provide a balanced coverage of theory and practical study, and encourage the development and implementation of knowledge-based intelligence models, methods, systems, and software tools. Applications in business, government, education, engineering, and healthcare are emphasized.