Sentiment analysis of student feedback using attention-based RNN and transformer embedding

Imad Zyout, Mo’ath Zyout
{"title":"利用基于注意力的 RNN 和变换器嵌入对学生反馈进行情感分析","authors":"Imad Zyout, Mo’ath Zyout","doi":"10.11591/ijai.v13.i2.pp2173-2184","DOIUrl":null,"url":null,"abstract":"Sentiment analysis systems aim to assess people’s opinions across various domains by collecting and categorizing feedback and reviews. In our study, researchers put forward a sentiment analysis system that leverages three distinct embedding techniques: automatic, global vectors (GloVe) for word representation, and bidirectional encoder representations from transformers (BERT). This system features an attention layer, with the best model chosen through rigorous comparisons. In developing the sentiment analysis model, we employed a hybrid dataset comprising students’ feedback and comments. This dataset comprises 3,820 comments, including 2,773 from formal evaluations and 1,047 generated by ChatGPT and prompting engineering. Our main motivation for integrating generative AI was to balance both positive and negative comments. We also explored recurrent neural network (RNN), gated recurrent unit (GRU), long short-term memory (LSTM), and bidirectional long short-term memory (Bi-LSTM), with and without pre-trained GloVe embedding. These techniques produced F-scores ranging from 67% to 69%. On the other hand, the sentiment model based on BERT, particularly its KERAS implementation, achieved higher F-scores ranging from 83% to 87%. The Bi-LSTM architecture outperformed other models and the inclusion of an attention layer further enhanced the performance, resulting in F-scores of 89% and 88% from the Bi-LSTM-BERT sentiment models, respectively.","PeriodicalId":507934,"journal":{"name":"IAES International Journal of Artificial Intelligence (IJ-AI)","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Sentiment analysis of student feedback using attention-based RNN and transformer embedding\",\"authors\":\"Imad Zyout, Mo’ath Zyout\",\"doi\":\"10.11591/ijai.v13.i2.pp2173-2184\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sentiment analysis systems aim to assess people’s opinions across various domains by collecting and categorizing feedback and reviews. In our study, researchers put forward a sentiment analysis system that leverages three distinct embedding techniques: automatic, global vectors (GloVe) for word representation, and bidirectional encoder representations from transformers (BERT). This system features an attention layer, with the best model chosen through rigorous comparisons. In developing the sentiment analysis model, we employed a hybrid dataset comprising students’ feedback and comments. This dataset comprises 3,820 comments, including 2,773 from formal evaluations and 1,047 generated by ChatGPT and prompting engineering. Our main motivation for integrating generative AI was to balance both positive and negative comments. We also explored recurrent neural network (RNN), gated recurrent unit (GRU), long short-term memory (LSTM), and bidirectional long short-term memory (Bi-LSTM), with and without pre-trained GloVe embedding. These techniques produced F-scores ranging from 67% to 69%. On the other hand, the sentiment model based on BERT, particularly its KERAS implementation, achieved higher F-scores ranging from 83% to 87%. 
The Bi-LSTM architecture outperformed other models and the inclusion of an attention layer further enhanced the performance, resulting in F-scores of 89% and 88% from the Bi-LSTM-BERT sentiment models, respectively.\",\"PeriodicalId\":507934,\"journal\":{\"name\":\"IAES International Journal of Artificial Intelligence (IJ-AI)\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IAES International Journal of Artificial Intelligence (IJ-AI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.11591/ijai.v13.i2.pp2173-2184\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IAES International Journal of Artificial Intelligence (IJ-AI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11591/ijai.v13.i2.pp2173-2184","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Sentiment analysis systems aim to assess people's opinions across various domains by collecting and categorizing feedback and reviews. In this study, we put forward a sentiment analysis system that leverages three distinct embedding techniques: automatic (trainable) embeddings, global vectors (GloVe) for word representation, and bidirectional encoder representations from transformers (BERT). The system features an attention layer, with the best model chosen through rigorous comparisons. To develop the sentiment analysis model, we employed a hybrid dataset of students' feedback and comments containing 3,820 entries: 2,773 from formal evaluations and 1,047 generated with ChatGPT and prompt engineering. Our main motivation for integrating generative AI was to balance the positive and negative comments. We also explored the recurrent neural network (RNN), gated recurrent unit (GRU), long short-term memory (LSTM), and bidirectional long short-term memory (Bi-LSTM) architectures, with and without pre-trained GloVe embeddings. These techniques produced F-scores ranging from 67% to 69%. In contrast, the sentiment models based on BERT, particularly its Keras implementation, achieved higher F-scores ranging from 83% to 87%. The Bi-LSTM architecture outperformed the other models, and the inclusion of an attention layer further enhanced performance, with the Bi-LSTM-BERT sentiment models reaching F-scores of 89% and 88%.
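
To make the modeling pipeline concrete, the following is a minimal Keras sketch of an attention-based Bi-LSTM sentiment classifier of the kind the abstract describes. It is not the authors' exact code: the vocabulary size, sequence length, embedding dimension, and LSTM width are illustrative assumptions, and the dot-product Attention layer stands in for whatever attention variant the paper used.

import tensorflow as tf
from tensorflow.keras import layers, Model

VOCAB_SIZE = 20_000  # assumed vocabulary size
MAX_LEN = 100        # assumed maximum comment length, in tokens
EMBED_DIM = 100      # assumed embedding width (100-d GloVe is common)

inputs = layers.Input(shape=(MAX_LEN,), dtype="int32")

# "Automatic" embedding: a trainable lookup learned with the model.
# To use pre-trained GloVe instead, pass weights=[glove_matrix] and
# optionally freeze the layer with trainable=False.
x = layers.Embedding(VOCAB_SIZE, EMBED_DIM)(inputs)

# Bi-LSTM over the token sequence; return_sequences=True keeps the
# per-timestep states that the attention layer scores.
h = layers.Bidirectional(layers.LSTM(64, return_sequences=True))(x)

# Dot-product self-attention over the Bi-LSTM states, followed by
# average pooling into a single sentence vector.
context = layers.Attention()([h, h])
sentence = layers.GlobalAveragePooling1D()(context)

# Binary output: positive vs. negative comment.
outputs = layers.Dense(1, activation="sigmoid")(sentence)

model = Model(inputs, outputs)
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])

Dropping the Attention/pooling pair and ending the Bi-LSTM with return_sequences=False recovers a plain Bi-LSTM baseline, which is one way to reproduce the with/without-attention comparison reported above.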
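
The BERT-based variant can be sketched, again under stated assumptions, by swapping the trainable embedding for contextual embeddings from a pre-trained transformer. The bert-base-uncased checkpoint, the Hugging Face transformers API, and the two sample comments used here are illustrative choices, not necessarily the paper's setup.

import tensorflow as tf
from transformers import AutoTokenizer, TFAutoModel

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")  # assumed checkpoint
bert = TFAutoModel.from_pretrained("bert-base-uncased")

comments = ["The lectures were clear and well organized.",
            "Feedback on assignments arrived far too late to help."]
enc = tokenizer(comments, padding=True, truncation=True,
                max_length=100, return_tensors="tf")

# Contextual token embeddings, shape (batch, seq_len, 768), replace
# the GloVe/trainable embedding layer of the previous sketch.
hidden = bert(input_ids=enc["input_ids"],
              attention_mask=enc["attention_mask"]).last_hidden_state

# The downstream head mirrors the Keras model above: Bi-LSTM + attention.
h = tf.keras.layers.Bidirectional(
    tf.keras.layers.LSTM(64, return_sequences=True))(hidden)
context = tf.keras.layers.Attention()([h, h])
sentence = tf.keras.layers.GlobalAveragePooling1D()(context)
probs = tf.keras.layers.Dense(1, activation="sigmoid")(sentence)
print(probs.shape)  # (2, 1): one sentiment score per comment

In practice the head would be wrapped in a Model and trained on the labeled comments; this fragment only shows how BERT's hidden states slot in where the word-embedding matrix sat in the first sketch.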