Title: Challenges and Opportunities of Language Representation Model
Authors: Ziang Zhou, Ziqian Li, Jiahong Lu
DOI: 10.1109/ICHCI51889.2020.00076
Published in: 2020 International Conference on Intelligent Computing and Human-Computer Interaction (ICHCI), December 2020
Citations: 0
Abstract
Pre-trained distributed natural language representations have been shown to be effective for improving many natural language processing tasks. However, recent research shows that pre-trained language models have obvious defects in robustness, interpretability, and other respects. This survey reviews the development of natural language representation models in detail, including but not limited to Word2Vec, Embeddings from Language Models (ELMo), and Bidirectional Encoder Representations from Transformers (BERT). Furthermore, this paper analyzes the advantages and disadvantages of existing models from multiple perspectives. Finally, this survey discusses the potential challenges facing natural language representation models.
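The key distinction among the models the abstract names is between static embeddings (Word2Vec assigns one fixed vector per word type) and contextual embeddings (ELMo and BERT assign each token occurrence a vector that depends on its sentence). The following pure-Python sketch illustrates that contrast; the toy vectors and the neighbour-averaging "context" function are invented for the example and are not the actual algorithms from the paper.

```python
# Toy static embedding table: one fixed vector per word type
# (Word2Vec-style). The 2-D values are made up for illustration.
STATIC = {
    "river": [0.9, 0.1],
    "money": [0.1, 0.9],
    "bank":  [0.5, 0.5],
    "the":   [0.2, 0.2],
}

def static_embed(tokens):
    """Look up each token's fixed vector, regardless of context."""
    return [STATIC[t] for t in tokens]

def contextual_embed(tokens):
    """Crude stand-in for a contextual model (ELMo/BERT-style):
    each token's vector is averaged with its immediate neighbours,
    so the same word gets different vectors in different sentences."""
    vecs = [STATIC[t] for t in tokens]
    out = []
    for i in range(len(vecs)):
        window = vecs[max(0, i - 1): i + 2]  # token plus its neighbours
        dim = len(vecs[i])
        out.append([sum(v[d] for v in window) / len(window)
                    for d in range(dim)])
    return out

s1 = ["the", "river", "bank"]
s2 = ["the", "money", "bank"]

# Static: "bank" has the identical vector in both sentences.
assert static_embed(s1)[2] == static_embed(s2)[2]
# Contextual: "bank" shifts toward "river" vs. "money".
assert contextual_embed(s1)[2] != contextual_embed(s2)[2]
```

Real contextual models replace the neighbour average with deep bidirectional LSTMs (ELMo) or Transformer self-attention (BERT), but the observable property is the same: the representation of a word varies with its context.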