A Survey on BERT and Its Applications

Sulaiman Aftan, Habib Shah
DOI: 10.1109/LT58159.2023.10092289
Published in: 2023 20th Learning and Technology Conference (L&T)
Publication date: 2023-01-26
Citations: 1

Abstract

Bidirectional Encoder Representations from Transformers (BERT) is a recently developed language representation model based on deep pre-training that has achieved excellent results on many complex tasks, such as classification, other Natural Language Processing (NLP) problems, and prediction. This survey summarizes BERT, its principal variants, and their latest developments and applications across various fields of computer science and engineering. It also discusses BERT's open problems and promising future research directions in different areas with multiple datasets. Overall, the findings show that BERT and its recent variants achieve more accurate, faster, and better-optimized results on most complex problems than typical machine and deep learning methods.
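As background for readers unfamiliar with how BERT is pre-trained as a language representation model, the following is a minimal sketch (not taken from the survey itself) of the masked-language-model input preparation described in the original BERT paper: a token sequence is wrapped with `[CLS]`/`[SEP]`, and roughly 15% of positions are selected for prediction, with the selected token replaced by `[MASK]` 80% of the time, by a random vocabulary token 10% of the time, and kept unchanged 10% of the time. The function name and vocabulary here are illustrative assumptions.

```python
import random

MASK_RATE = 0.15  # fraction of tokens selected for prediction, per the BERT paper

def prepare_mlm_example(tokens, vocab, rng):
    """Return (input_tokens, labels).

    labels holds the original token at each selected position and None
    elsewhere (positions the training loss ignores).
    """
    inputs = ["[CLS]"] + list(tokens) + ["[SEP]"]
    labels = [None] * len(inputs)
    for i in range(1, len(inputs) - 1):  # never mask [CLS] or [SEP]
        if rng.random() < MASK_RATE:
            labels[i] = inputs[i]          # remember the original token
            roll = rng.random()
            if roll < 0.8:
                inputs[i] = "[MASK]"           # 80%: replace with [MASK]
            elif roll < 0.9:
                inputs[i] = rng.choice(vocab)  # 10%: replace with a random token
            # remaining 10%: keep the original token unchanged
    return inputs, labels
```

Because some selected tokens are kept or randomized rather than always masked, the model cannot rely on `[MASK]` appearing at prediction positions, which reduces the mismatch between pre-training and fine-tuning inputs.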