Multitask Sentiment Analysis and Topic Classification Using BERT

P. Shah, Hiren Patel, Priya Swaminarayan
{"title":"Multitask Sentiment Analysis and Topic Classification Using BERT","authors":"P. Shah, Hiren Patel, Priya Swaminarayan","doi":"10.4108/eetsis.5287","DOIUrl":null,"url":null,"abstract":"In this study, a multitask model is proposed to perform simultaneous news category and sentiment classification of a diverse dataset comprising 3263 news records spanning across eight categories, including environment, health, education, tech, sports, business, lifestyle, and science. Leveraging the power of Bidirectional Encoder Representations from Transformers (BERT), the algorithm demonstrates remarkable results in both tasks. For topic classification, it achieves an accuracy of 98% along with balanced precision and recall, substantiating its proficiency in categorizing news articles. For sentiment analysis, the model maintains strong accuracy at 94%, distinguishing positive from negative sentiment effectively. This multitask approach showcases the model's versatility and its potential to comprehensively understand and classify news articles based on content and sentiment. This multitask model not only enhances classification accuracy but also improves the efficiency of handling extensive news datasets. Consequently, it empowers news agencies, content recommendation systems, and information retrieval services to offer more personalized and pertinent content to their users.","PeriodicalId":155438,"journal":{"name":"ICST Transactions on Scalable Information Systems","volume":"79 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ICST Transactions on Scalable Information Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4108/eetsis.5287","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

In this study, a multitask model is proposed to perform simultaneous news category and sentiment classification on a diverse dataset of 3263 news records spanning eight categories: environment, health, education, tech, sports, business, lifestyle, and science. Leveraging Bidirectional Encoder Representations from Transformers (BERT), the model achieves strong results on both tasks. For topic classification, it reaches an accuracy of 98% with balanced precision and recall, demonstrating its proficiency in categorizing news articles. For sentiment analysis, it maintains 94% accuracy, effectively distinguishing positive from negative sentiment. This multitask approach showcases the model's versatility and its potential to comprehensively understand and classify news articles by both content and sentiment. The multitask model not only enhances classification accuracy but also improves the efficiency of handling extensive news datasets, enabling news agencies, content recommendation systems, and information retrieval services to offer more personalized and pertinent content to their users.
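To make the architecture described in the abstract concrete, the following is a minimal sketch (not the authors' released code) of a multitask BERT classifier: a shared BERT encoder feeding two heads, one for 8-way topic classification and one for binary sentiment. It assumes PyTorch and the Hugging Face transformers library; the model name `bert-base-uncased`, the class name `MultitaskBert`, the equal loss weighting, and all hyperparameters are illustrative assumptions rather than details taken from the paper.

```python
# Hedged sketch of a multitask BERT classifier (assumed setup, not the paper's code).
import torch
import torch.nn as nn
from transformers import BertModel, BertTokenizerFast

class MultitaskBert(nn.Module):
    def __init__(self, num_topics: int = 8, num_sentiments: int = 2,
                 model_name: str = "bert-base-uncased"):
        super().__init__()
        self.encoder = BertModel.from_pretrained(model_name)    # shared BERT encoder
        hidden = self.encoder.config.hidden_size
        self.dropout = nn.Dropout(0.1)
        self.topic_head = nn.Linear(hidden, num_topics)         # task 1: news category
        self.sentiment_head = nn.Linear(hidden, num_sentiments) # task 2: sentiment

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        cls = self.dropout(out.last_hidden_state[:, 0])         # [CLS] token representation
        return self.topic_head(cls), self.sentiment_head(cls)

# Joint training step: sum the two cross-entropy losses so both tasks
# update the shared encoder (equal weighting is an assumption).
tokenizer = BertTokenizerFast.from_pretrained("bert-base-uncased")
model = MultitaskBert()
loss_fn = nn.CrossEntropyLoss()

batch = tokenizer(["Stocks rallied after the quarterly earnings report."],
                  padding=True, truncation=True, max_length=128, return_tensors="pt")
topic_logits, sent_logits = model(batch["input_ids"], batch["attention_mask"])
topic_labels = torch.tensor([5])   # e.g., "business" (label indices are hypothetical)
sent_labels = torch.tensor([1])    # e.g., positive
loss = loss_fn(topic_logits, topic_labels) + loss_fn(sent_logits, sent_labels)
loss.backward()
```

Sharing a single encoder lets the two tasks regularize each other and halves the inference cost compared with running two separate BERT models; the per-task accuracies reported above come from the paper, while the loss weighting and head design here are placeholders.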