A Survey of Transformer-Based Natural Language Processing Models
Author: 鸣姝 赖
Journal: 人工智能与机器人研究 (Artificial Intelligence and Robotics Research)
DOI: 10.12677/airr.2023.123025 (https://doi.org/10.12677/airr.2023.123025)
Published: 2023-01-01 (Journal Article)
Citations: 0
Abstract
Natural language processing (NLP) is a subfield of computer science, closely tied to deep learning, that aims to enable computers to understand, parse, and generate human language (text, audio, etc.). This paper mainly introduces the various types of models derived from the Transformer architecture in NLP. In recent years, with the rapid development of deep learning technology, the
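The models surveyed here all build on the Transformer's core operation, scaled dot-product attention: each query is compared against all keys, the similarities are normalized with a softmax, and the result weights a sum over the values. As an illustrative sketch (not code from the paper), a minimal pure-Python version might look like this, with `Q`, `K`, and `V` as plain lists of vectors:

```python
import math

def scaled_dot_product_attention(Q, K, V):
    """Illustrative sketch of softmax(Q K^T / sqrt(d_k)) V,
    the core operation of the Transformer."""
    d_k = len(Q[0])  # key/query dimension, used for scaling
    out = []
    for q in Q:
        # Similarity of this query to every key, scaled by sqrt(d_k).
        scores = [sum(qi * ki for qi, ki in zip(q, k)) / math.sqrt(d_k)
                  for k in K]
        # Softmax over the scores (shifted by the max for stability).
        m = max(scores)
        exps = [math.exp(s - m) for s in scores]
        total = sum(exps)
        weights = [e / total for e in exps]
        # Weighted sum of the value vectors.
        out.append([sum(w * v[j] for w, v in zip(weights, V))
                    for j in range(len(V[0]))])
    return out

# Toy example: 2 query tokens attending over 3 key/value tokens.
Q = [[1.0, 0.0], [0.0, 1.0]]
K = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
V = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0]]
result = scaled_dot_product_attention(Q, K, V)
print(len(result), len(result[0]))  # 2 queries, each mapped to a 2-d output
```

Encoder-only models such as BERT, decoder-only models such as GPT, and encoder-decoder models such as T5 differ mainly in how they stack and mask this same attention primitive.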