Anil Bas, M. O. Topal, Çağdaş Duman, Imke van Heerden
{"title":"基于深度学习的文本生成简史","authors":"Anil Bas, M. O. Topal, Çağdaş Duman, Imke van Heerden","doi":"10.1109/ICCA56443.2022.10039545","DOIUrl":null,"url":null,"abstract":"A dynamic domain in Artificial Intelligence research, Natural Language Generation centres on the automatic generation of realistic text. To help navigate this vast and swiftly developing body of work, the study provides a concise overview of noteworthy stages in the history of text generation. To this end, the paper describes deep learning models for a broad audience, focusing on traditional, convolutional, recurrent and generative adversarial networks, as well as transformer architecture.","PeriodicalId":153139,"journal":{"name":"2022 International Conference on Computer and Applications (ICCA)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Brief History of Deep Learning-Based Text Generation\",\"authors\":\"Anil Bas, M. O. Topal, Çağdaş Duman, Imke van Heerden\",\"doi\":\"10.1109/ICCA56443.2022.10039545\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A dynamic domain in Artificial Intelligence research, Natural Language Generation centres on the automatic generation of realistic text. To help navigate this vast and swiftly developing body of work, the study provides a concise overview of noteworthy stages in the history of text generation. To this end, the paper describes deep learning models for a broad audience, focusing on traditional, convolutional, recurrent and generative adversarial networks, as well as transformer architecture.\",\"PeriodicalId\":153139,\"journal\":{\"name\":\"2022 International Conference on Computer and Applications (ICCA)\",\"volume\":\"67 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 International Conference on Computer and Applications (ICCA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICCA56443.2022.10039545\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 International Conference on Computer and Applications (ICCA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCA56443.2022.10039545","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Brief History of Deep Learning-Based Text Generation
A dynamic domain in Artificial Intelligence research, Natural Language Generation centres on the automatic generation of realistic text. To help navigate this vast and swiftly developing body of work, the study provides a concise overview of noteworthy stages in the history of text generation. To this end, the paper describes deep learning models for a broad audience, focusing on traditional, convolutional, recurrent and generative adversarial networks, as well as transformer architecture.