Shovan Bhowmik, Rifat Sadik, Wahiduzzaman Akanda, Juboraj Roy Pavel
Title: Sentiment analysis with hotel customer reviews using FNet
DOI: 10.11591/eei.v13i2.6301
Journal: Bulletin of Electrical Engineering and Informatics
Published: 2024-04-01 (Journal Article)
Citations: 0
Abstract
Recent research has focused on opinion mining from public sentiments using natural language processing (NLP) and machine learning (ML) techniques. Transformer-based models, such as bidirectional encoder representations from transformers (BERT), excel at extracting semantic information but are resource-intensive. Google's FNet ("mixing tokens with Fourier transforms") replaced BERT's attention mechanism with a non-parameterized Fourier transform, aiming to reduce training time without compromising performance. This study fine-tuned the FNet model on a publicly available Kaggle hotel review dataset and compared its performance on this dataset against BERT as well as conventional machine learning models such as long short-term memory (LSTM) and support vector machine (SVM). Results revealed that FNet significantly reduces training time by almost 20% and memory utilization by nearly 60% compared to BERT. The highest test accuracy achieved by FNet in this experiment was 80.27%, which is nearly 97.85% of BERT's performance with identical parameters.
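The core idea described in the abstract, replacing the learned attention sublayer with a parameter-free 2D discrete Fourier transform, can be sketched as follows. This is a minimal NumPy illustration of an FNet-style encoder block, not the authors' implementation; all shapes, the `fourier_mixing` and `fnet_encoder_block` names, and the toy dimensions are assumptions for demonstration.

```python
import numpy as np

def fourier_mixing(x):
    """FNet-style token mixing: a 2D DFT over the sequence and hidden
    dimensions, keeping only the real part. It has no learned
    parameters, which is what reduces attention's training cost."""
    return np.fft.fft2(x).real

def layer_norm(x, eps=1e-6):
    # Standard layer normalization over the hidden dimension.
    mean = x.mean(axis=-1, keepdims=True)
    var = x.var(axis=-1, keepdims=True)
    return (x - mean) / np.sqrt(var + eps)

def fnet_encoder_block(x, w1, b1, w2, b2):
    # Mixing sublayer with residual connection and layer norm, then a
    # position-wise feed-forward sublayer, as in a Transformer encoder
    # with self-attention swapped for the Fourier transform.
    x = layer_norm(x + fourier_mixing(x))
    ff = np.maximum(x @ w1 + b1, 0) @ w2 + b2  # ReLU feed-forward
    return layer_norm(x + ff)

# Toy shapes (assumed): sequence length 4, hidden size 8, FF size 16.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 8))
w1, b1 = rng.normal(size=(8, 16)), np.zeros(16)
w2, b2 = rng.normal(size=(16, 8)), np.zeros(8)
out = fnet_encoder_block(x, w1, b1, w2, b2)
print(out.shape)  # (4, 8): the block preserves input shape
```

Because the mixing step is a fixed linear map rather than a learned, input-dependent attention matrix, training skips the query/key/value projections entirely, which is consistent with the reported reductions in training time and memory.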
Journal description:
Bulletin of Electrical Engineering and Informatics publishes original papers in the field of electrical, computer, and informatics engineering, covering, but not limited to, the following scope: Computer Science, Computer Engineering and Informatics[...] Electronics[...] Electrical and Power Engineering[...] Telecommunication and Information Technology[...] Instrumentation and Control Engineering[...]