{"title":"基于二次检验的提取摘要","authors":"Yanfang Cheng, Yinan Lu","doi":"10.1145/3331453.3361304","DOIUrl":null,"url":null,"abstract":"Aiming to solve the problems of insufficient semantic understanding and low statement accuracy in the automatic text summarization of natural language processing, in this paper we present a novel neural network method for extractive summarization by Quadratic Check. The sentence extractor extracts the primary sentences by a scoring and selecting module, among which it selects the final sentences based on impact factor computed by the transformer model and builds the output summary by the quadratic check. The transformer model based on attention mechanisms is trained and generates the probability impact factor of each sentence used for further predicting the relative importance of the sentences. Meanwhile word frequency and position information are used in the process of extractive summarization. Finally, the effectiveness of the proposed method is verified by the experiments on CNN/Daily Mail and DUC2002 datasets.","PeriodicalId":162067,"journal":{"name":"Proceedings of the 3rd International Conference on Computer Science and Application Engineering","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Extractive Summarization Based on Quadratic Check\",\"authors\":\"Yanfang Cheng, Yinan Lu\",\"doi\":\"10.1145/3331453.3361304\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Aiming to solve the problems of insufficient semantic understanding and low statement accuracy in the automatic text summarization of natural language processing, in this paper we present a novel neural network method for extractive summarization by Quadratic Check. The sentence extractor extracts the primary sentences by a scoring and selecting module, among which it selects the final sentences based on impact factor computed by the transformer model and builds the output summary by the quadratic check. The transformer model based on attention mechanisms is trained and generates the probability impact factor of each sentence used for further predicting the relative importance of the sentences. Meanwhile word frequency and position information are used in the process of extractive summarization. 
Finally, the effectiveness of the proposed method is verified by the experiments on CNN/Daily Mail and DUC2002 datasets.\",\"PeriodicalId\":162067,\"journal\":{\"name\":\"Proceedings of the 3rd International Conference on Computer Science and Application Engineering\",\"volume\":\"23 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 3rd International Conference on Computer Science and Application Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3331453.3361304\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 3rd International Conference on Computer Science and Application Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3331453.3361304","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: To address insufficient semantic understanding and low sentence accuracy in automatic text summarization, this paper presents a novel neural network method for extractive summarization based on a quadratic check. A sentence extractor first picks out candidate sentences with a scoring-and-selection module, then chooses the final sentences according to impact factors computed by a Transformer model and builds the output summary through the quadratic check. The Transformer model, built on attention mechanisms, is trained to produce a probability impact factor for each sentence, which is used to predict the relative importance of the sentences. Word frequency and position information are also used in the extraction process. Finally, the effectiveness of the proposed method is verified by experiments on the CNN/Daily Mail and DUC2002 datasets.
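The abstract describes a two-stage selection: a first pass that scores sentences from surface features such as word frequency and position, and a second pass (the quadratic check) that re-ranks the surviving candidates with the impact factors produced by the Transformer model. The sketch below illustrates that pipeline under stated assumptions: the impact factors are supplied by a separately trained model, and the function names (score_sentence, quadratic_check) and the scoring formula are illustrative placeholders rather than the authors' actual implementation.

```python
# Hypothetical sketch of the two-stage "quadratic check" selection described
# in the abstract; names and the scoring formula are illustrative only.

from collections import Counter
from typing import List


def score_sentence(tokens: List[str], position: int, doc_freq: Counter, n_sents: int) -> float:
    """First-pass score from word frequency and sentence position."""
    freq_score = sum(doc_freq[t] for t in tokens) / max(len(tokens), 1)
    position_score = 1.0 - position / max(n_sents, 1)  # earlier sentences weigh more
    return freq_score + position_score


def quadratic_check(
    sentences: List[List[str]],       # tokenized sentences of one document
    impact_factors: List[float],      # assumed to come from a trained Transformer
    k_candidates: int = 6,
    k_final: int = 3,
) -> List[int]:
    """Two-pass extraction: pick candidates by frequency/position score,
    then re-rank those candidates by the model's impact factor."""
    doc_freq = Counter(t for sent in sentences for t in sent)
    n = len(sentences)

    # Pass 1: score every sentence and keep the top-k candidates.
    first_scores = [score_sentence(s, i, doc_freq, n) for i, s in enumerate(sentences)]
    candidates = sorted(range(n), key=lambda i: first_scores[i], reverse=True)[:k_candidates]

    # Pass 2 (the "check"): re-rank candidates by the Transformer impact factor.
    final = sorted(candidates, key=lambda i: impact_factors[i], reverse=True)[:k_final]
    return sorted(final)  # restore document order for the output summary
```

For example, calling quadratic_check(tokenized_sentences, impact_factors) with the defaults would keep the six highest-scoring sentences from the first pass and return, in document order, the indices of the three with the highest impact factors.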