{"title":"A Survey on BERT and Its Applications","authors":"Sulaiman Aftan, Habib Shah","doi":"10.1109/LT58159.2023.10092289","DOIUrl":null,"url":null,"abstract":"A recently developed language representation model named Bidirectional Encoder Representation from Transformers (BERT) is based on an advanced trained deep learning approach that has achieved excellent results in many complex tasks, the same as classification, Natural Language Processing (NLP), prediction, etc. This survey paper mainly adopts the summary of BERT, its multiple types, and its latest developments and applications in various computer science and engineering fields. Furthermore, it puts forward BERT's problems and attractive future research trends in a different area with multiple datasets. From the findings, overall, the BERT and their recent types have achieved more accurate, fast, and optimal results in solving most complex problems than typical Machine and Deep Learning methods.","PeriodicalId":142898,"journal":{"name":"2023 20th Learning and Technology Conference (L&T)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2023 20th Learning and Technology Conference (L&T)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/LT58159.2023.10092289","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citation count: 1
Abstract
Bidirectional Encoder Representations from Transformers (BERT) is a recently developed language representation model based on deep, pre-trained Transformer encoders that has achieved excellent results on many complex tasks, such as classification, other Natural Language Processing (NLP) tasks, and prediction. This survey summarizes BERT, its main variants, and its latest developments and applications across various computer science and engineering fields. It also outlines BERT's open problems and promising future research directions in different areas and over multiple datasets. Overall, the findings indicate that BERT and its recent variants achieve more accurate, faster, and near-optimal results on most complex problems than typical machine learning and deep learning methods.
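As context for the kind of classification workflow the survey covers, the following is a minimal sketch of fine-tuning-style inference with a pre-trained BERT checkpoint, assuming the Hugging Face transformers and PyTorch libraries are available; the checkpoint name, label count, and example sentence are illustrative and not taken from the paper.

import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Illustrative checkpoint; any BERT-family model with a classification head would work similarly.
model_name = "bert-base-uncased"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name, num_labels=2)
model.eval()

# Tokenize an example sentence into input IDs and an attention mask.
inputs = tokenizer("BERT has improved many NLP benchmarks.",
                   return_tensors="pt", truncation=True, padding=True)

# Run the encoder and classification head without tracking gradients.
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to class probabilities.
probs = torch.softmax(logits, dim=-1)
print(probs)

In practice, the classification head here is randomly initialized until fine-tuned on a labeled dataset, which is the typical usage pattern for the task-specific applications discussed in the survey.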