{"title":"用快速文本转换网络生成中文故事","authors":"Jhe-Wei Lin, Yunwen Gao, Rong-Guey Chang","doi":"10.1109/ICAIIC.2019.8669087","DOIUrl":null,"url":null,"abstract":"The sequence transformer models are based on complex recurrent neural network or convolutional networks that include an encoder and a decoder. High-accuracy models are usually represented by used connect the encoder and decoder through an attention mechanism. Story generation is an important thing. If we can let computers learn the ability of story-telling, computers can help people do more things. Actually, the squence2squence model combine attention mechanism is being used to Chinese poetry generation. However, it difficult to apply in Chinese story generation, because there are some rules in Chinese poetry generation. Therefore, we trying to use 1372 human-labeled summarization of paragraphs from a classic novel named “Demi-Gods and Semi-Devils” (天龍八部) to train the transformer network. In our experiment, we use FastText to combine Demi-Gods and Semi-Devils Dataset and A Large Scale Chinese Short Text Summarization Dataset to be input data. In addition, we got a lower loss rate by using two layer of self-attention mechanism.","PeriodicalId":273383,"journal":{"name":"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","volume":"35 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Chinese Story Generation with FastText Transformer Network\",\"authors\":\"Jhe-Wei Lin, Yunwen Gao, Rong-Guey Chang\",\"doi\":\"10.1109/ICAIIC.2019.8669087\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The sequence transformer models are based on complex recurrent neural network or convolutional networks that include an encoder and a decoder. High-accuracy models are usually represented by used connect the encoder and decoder through an attention mechanism. Story generation is an important thing. If we can let computers learn the ability of story-telling, computers can help people do more things. Actually, the squence2squence model combine attention mechanism is being used to Chinese poetry generation. However, it difficult to apply in Chinese story generation, because there are some rules in Chinese poetry generation. Therefore, we trying to use 1372 human-labeled summarization of paragraphs from a classic novel named “Demi-Gods and Semi-Devils” (天龍八部) to train the transformer network. In our experiment, we use FastText to combine Demi-Gods and Semi-Devils Dataset and A Large Scale Chinese Short Text Summarization Dataset to be input data. 
In addition, we got a lower loss rate by using two layer of self-attention mechanism.\",\"PeriodicalId\":273383,\"journal\":{\"name\":\"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"volume\":\"35 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAIIC.2019.8669087\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 International Conference on Artificial Intelligence in Information and Communication (ICAIIC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAIIC.2019.8669087","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Chinese Story Generation with FastText Transformer Network
Sequence-transduction models are based on complex recurrent or convolutional neural networks that include an encoder and a decoder. The best-performing models typically connect the encoder and decoder through an attention mechanism. Story generation is an important task: if computers can learn to tell stories, they can help people with many more tasks. The sequence-to-sequence model combined with an attention mechanism has already been applied to Chinese poetry generation. However, that approach is difficult to transfer to Chinese story generation, because Chinese poetry follows fixed structural rules that free-form stories do not. Therefore, we use 1,372 human-labeled paragraph summaries from the classic novel “Demi-Gods and Semi-Devils” (天龍八部) to train a Transformer network. In our experiment, we use FastText to combine the Demi-Gods and Semi-Devils dataset and A Large Scale Chinese Short Text Summarization Dataset (LCSTS) as input data. In addition, we obtained a lower loss by using two layers of the self-attention mechanism.
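The paper itself does not include code, but a minimal sketch of the pipeline described in the abstract might look like the following, assuming gensim for the FastText embeddings and PyTorch for the Transformer. The corpus placeholder, file-free training setup, vector size, and other hyperparameters are illustrative assumptions, not the authors' actual settings.

```python
# Sketch: FastText embeddings feeding an encoder-decoder Transformer whose
# encoder/decoder each use two self-attention layers, echoing the abstract's
# "two layers of the self-attention mechanism". All hyperparameters and the
# toy corpus below are assumptions for illustration only.
import torch
import torch.nn as nn
from gensim.models import FastText

# 1. Train FastText on the combined corpora (novel paragraph summaries + LCSTS).
#    Here a tiny placeholder list of tokenized Chinese sentences stands in
#    for the real, segmented training text.
sentences = [["段", "譽", "來", "到", "大", "理"], ["喬", "峰", "是", "丐", "幫", "幫", "主"]]
ft = FastText(sentences, vector_size=256, window=5, min_count=1, epochs=10)

# 2. Build a vocabulary index and an embedding matrix aligned with it.
vocab = {tok: i for i, tok in enumerate(ft.wv.index_to_key)}
emb_matrix = torch.tensor(ft.wv.vectors, dtype=torch.float)

# 3. A small encoder-decoder Transformer initialized from the FastText vectors.
class StoryTransformer(nn.Module):
    def __init__(self, emb_matrix, nhead=8, num_layers=2):
        super().__init__()
        vocab_size, d_model = emb_matrix.shape
        self.embed = nn.Embedding.from_pretrained(emb_matrix, freeze=False)
        self.transformer = nn.Transformer(
            d_model=d_model, nhead=nhead,
            num_encoder_layers=num_layers, num_decoder_layers=num_layers,
            batch_first=True)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        src = self.embed(src_ids)  # (batch, src_len, d_model)
        tgt = self.embed(tgt_ids)  # (batch, tgt_len, d_model)
        # Causal mask so each target position only attends to earlier positions.
        tgt_mask = self.transformer.generate_square_subsequent_mask(tgt_ids.size(1))
        hidden = self.transformer(src, tgt, tgt_mask=tgt_mask)
        return self.out(hidden)    # next-token logits

model = StoryTransformer(emb_matrix)

# Toy forward pass: summary tokens as source, story prefix as target.
src_ids = torch.tensor([[vocab["段"], vocab["譽"]]])
tgt_ids = torch.tensor([[vocab["來"], vocab["到"]]])
logits = model(src_ids, tgt_ids)   # shape: (1, 2, vocab_size)
```

Initializing the embedding layer from FastText vectors (rather than training embeddings from scratch) is one plausible reading of "use FastText ... as input data"; the embeddings are left unfrozen here so the Transformer can fine-tune them during training.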