Chinese text summarization generation based on transformer and temporal convolutional network
Wenming Huang, Yaowei Zhou, Yannan Xiao, Yayuan Wen, Zhenrong Deng
International Conference on Electronic Information Technology, 2023-08-15. DOI: 10.1117/12.2685723
{"title":"基于变换和时间卷积网络的中文摘要生成","authors":"Wenming Huang, Yaowei Zhou, Yannan Xiao, Yayuan Wen, Zhenrong Deng","doi":"10.1117/12.2685723","DOIUrl":null,"url":null,"abstract":"The recurrent neural network model based on attention mechanism has achieved good results in the text summarization generation task, but such models have problems such as insufficient parallelism and exposure bias. In order to solve the above problems, this paper proposes a two-stage Chinese text summarization generation method based on Transformer and temporal convolutional network. The first stage uses a summary generation model that fuses Transformer and a temporal convolutional network, and generates multiple candidate summaries through beam search at the decoding end. In the second stage, contrastive learning is introduced, and the candidate summaries are sorted and scored using the Roberta model to select the final summary. Through experiments on the Chinese short text summarization dataset LCSTS, ROUGE was used as the evaluation method to verify the effectiveness of the proposed method on Chinese text summarization.","PeriodicalId":305812,"journal":{"name":"International Conference on Electronic Information Technology","volume":"37 9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Chinese text summarization generation based on transformer and temporal convolutional network\",\"authors\":\"Wenming Huang, Yaowei Zhou, Yannan Xiao, Yayuan Wen, Zhenrong Deng\",\"doi\":\"10.1117/12.2685723\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The recurrent neural network model based on attention mechanism has achieved good results in the text summarization generation task, but such models have problems such as insufficient parallelism and exposure bias. In order to solve the above problems, this paper proposes a two-stage Chinese text summarization generation method based on Transformer and temporal convolutional network. The first stage uses a summary generation model that fuses Transformer and a temporal convolutional network, and generates multiple candidate summaries through beam search at the decoding end. In the second stage, contrastive learning is introduced, and the candidate summaries are sorted and scored using the Roberta model to select the final summary. 
Through experiments on the Chinese short text summarization dataset LCSTS, ROUGE was used as the evaluation method to verify the effectiveness of the proposed method on Chinese text summarization.\",\"PeriodicalId\":305812,\"journal\":{\"name\":\"International Conference on Electronic Information Technology\",\"volume\":\"37 9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-08-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Conference on Electronic Information Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1117/12.2685723\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Conference on Electronic Information Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1117/12.2685723","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Chinese text summarization generation based on transformer and temporal convolutional network
Attention-based recurrent neural network models have achieved good results on the text summarization task, but they suffer from limited parallelism and exposure bias. To address these problems, this paper proposes a two-stage Chinese text summarization method based on the Transformer and a temporal convolutional network (TCN). In the first stage, a summary generation model that fuses the Transformer with a temporal convolutional network produces multiple candidate summaries through beam search at the decoding end. In the second stage, contrastive learning is introduced: the candidate summaries are scored and ranked with a RoBERTa model, and the highest-ranked candidate is selected as the final summary. Experiments on the Chinese short text summarization dataset LCSTS, evaluated with ROUGE, verify the effectiveness of the proposed method for Chinese text summarization.
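To make the second-stage re-ranking concrete, below is a minimal illustrative sketch, not the authors' implementation. It assumes the candidate summaries have already been produced by beam search in the first-stage Transformer/TCN model, and it scores each candidate by cosine similarity between mean-pooled RoBERTa embeddings of the source text and the candidate (in the spirit of contrastive re-ranking). The checkpoint name `hfl/chinese-roberta-wwm-ext` and the similarity-based scoring rule are assumptions for illustration; the paper's exact scoring model and training objective are not specified here.

```python
# Sketch of second-stage candidate re-ranking (illustrative, not the paper's exact method).
import torch
from transformers import AutoTokenizer, AutoModel

MODEL_NAME = "hfl/chinese-roberta-wwm-ext"  # assumed Chinese RoBERTa checkpoint
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)
encoder.eval()

def embed(texts):
    """Mean-pooled RoBERTa sentence embeddings for a list of texts."""
    batch = tokenizer(texts, padding=True, truncation=True,
                      max_length=512, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**batch).last_hidden_state          # (B, L, H)
    mask = batch["attention_mask"].unsqueeze(-1)              # (B, L, 1)
    summed = (hidden * mask).sum(dim=1)
    counts = mask.sum(dim=1).clamp(min=1)
    return summed / counts                                    # (B, H)

def rerank(source: str, candidates: list[str]) -> str:
    """Score each beam-search candidate against the source and return the best one."""
    src_emb = embed([source])                                  # (1, H)
    cand_emb = embed(candidates)                               # (K, H)
    scores = torch.nn.functional.cosine_similarity(src_emb, cand_emb)
    return candidates[int(scores.argmax())]

# Usage: `candidates` would be the K summaries produced by beam search at the
# decoding end of the first-stage model; the highest-scoring one is kept.
# best_summary = rerank(source_text, candidates)
```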