Muhammad Talha Riaz, M. Shah Jahan, S. G. Khawaja, A. Shaukat, Jahan Zeb
{"title":"TM-BERT:用于对Covid-19疫苗接种推文进行情绪分析的Twitter改进BERT","authors":"Muhammad Talha Riaz, M. Shah Jahan, S. G. Khawaja, A. Shaukat, Jahan Zeb","doi":"10.1109/ICoDT255437.2022.9787395","DOIUrl":null,"url":null,"abstract":"In transfer learning a model is pre-trained on a large unsupervised dataset and then fine-tuned on domain-specific downstream tasks. BERT is the first true-natured deep bidirectional language model which reads the input from both sides of input to better understand the context of a sentence by solely relying on the Attention mechanism. This study presents a Twitter Modified BERT (TM-BERT) based upon Transformer architecture. It has also developed a new Covid-19 Vaccination Sentiment Analysis Task (CV-SAT) and a COVID-19 unsupervised pre-training dataset containing (70K) tweets. BERT achieved (0.70) and (0.76) accuracy when fine-tuned on CV-SAT, whereas TM-BERT achieved (0.89), a (19%) and (13%) accuracy over BERT. Another enhancement introduced is in terms of time efficiency as BERT takes (64) hours of pre-training while TM-BERT takes only (17) hours and still produces (19%) improvement even after pre-trained on four (4) times fewer data.","PeriodicalId":291030,"journal":{"name":"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"TM-BERT: A Twitter Modified BERT for Sentiment Analysis on Covid-19 Vaccination Tweets\",\"authors\":\"Muhammad Talha Riaz, M. Shah Jahan, S. G. Khawaja, A. Shaukat, Jahan Zeb\",\"doi\":\"10.1109/ICoDT255437.2022.9787395\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In transfer learning a model is pre-trained on a large unsupervised dataset and then fine-tuned on domain-specific downstream tasks. BERT is the first true-natured deep bidirectional language model which reads the input from both sides of input to better understand the context of a sentence by solely relying on the Attention mechanism. This study presents a Twitter Modified BERT (TM-BERT) based upon Transformer architecture. It has also developed a new Covid-19 Vaccination Sentiment Analysis Task (CV-SAT) and a COVID-19 unsupervised pre-training dataset containing (70K) tweets. BERT achieved (0.70) and (0.76) accuracy when fine-tuned on CV-SAT, whereas TM-BERT achieved (0.89), a (19%) and (13%) accuracy over BERT. 
Another enhancement introduced is in terms of time efficiency as BERT takes (64) hours of pre-training while TM-BERT takes only (17) hours and still produces (19%) improvement even after pre-trained on four (4) times fewer data.\",\"PeriodicalId\":291030,\"journal\":{\"name\":\"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)\",\"volume\":\"29 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-05-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICoDT255437.2022.9787395\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 2nd International Conference on Digital Futures and Transformative Technologies (ICoDT2)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICoDT255437.2022.9787395","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
TM-BERT: A Twitter Modified BERT for Sentiment Analysis on Covid-19 Vaccination Tweets
In transfer learning, a model is pre-trained on a large unlabeled dataset and then fine-tuned on domain-specific downstream tasks. BERT is the first deeply bidirectional language model: relying solely on the attention mechanism, it reads the input in both directions to better capture the context of a sentence. This study presents a Twitter Modified BERT (TM-BERT) based on the Transformer architecture. It also introduces a new Covid-19 Vaccination Sentiment Analysis Task (CV-SAT) and a COVID-19 unsupervised pre-training dataset containing 70K tweets. BERT achieved 0.70 and 0.76 accuracy when fine-tuned on CV-SAT, whereas TM-BERT achieved 0.89, an improvement of 19% and 13% over BERT, respectively. Another enhancement is time efficiency: BERT requires 64 hours of pre-training, while TM-BERT requires only 17 hours and still delivers the 19% improvement despite being pre-trained on four times less data.
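To make the pre-train-then-fine-tune workflow described in the abstract concrete, below is a minimal sketch of fine-tuning a BERT-style checkpoint for tweet sentiment classification with the Hugging Face `transformers` library. This is not the authors' TM-BERT code; the checkpoint name, label count, file names, and hyperparameters are assumptions for illustration only.

```python
# Minimal sketch (assumptions, not the paper's implementation):
# fine-tune a BERT-style model on a tweet sentiment dataset in the
# spirit of the CV-SAT fine-tuning step.
from transformers import (AutoTokenizer, AutoModelForSequenceClassification,
                          Trainer, TrainingArguments)
from datasets import load_dataset

# Hypothetical CSV files with columns "text" (tweet) and "label" (sentiment id).
raw = load_dataset("csv", data_files={"train": "cv_sat_train.csv",
                                      "test": "cv_sat_test.csv"})

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=3)  # e.g. negative / neutral / positive

def tokenize(batch):
    # Truncate/pad tweets to a fixed length for batching.
    return tokenizer(batch["text"], truncation=True,
                     padding="max_length", max_length=128)

encoded = raw.map(tokenize, batched=True)

args = TrainingArguments(output_dir="tm_bert_cv_sat",
                         num_train_epochs=3,
                         per_device_train_batch_size=32,
                         learning_rate=2e-5)

trainer = Trainer(model=model, args=args,
                  train_dataset=encoded["train"],
                  eval_dataset=encoded["test"])
trainer.train()
print(trainer.evaluate())  # reports evaluation loss on the test split
```

In the paper's setup, the checkpoint being fine-tuned would first undergo additional unsupervised pre-training on the 70K-tweet COVID-19 corpus; the sketch above only shows the downstream fine-tuning stage.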