{"title":"Survey on Automatic Text Summarization and Transformer Models Applicability","authors":"Guan Wang, I. Smetannikov, T. Man","doi":"10.1145/3437802.3437832","DOIUrl":null,"url":null,"abstract":"This survey talks about Automatic Text Summarization. Information explosion, the problem caused by the rapid growth of the internet, increased more and more necessity of powerful summarizers. This article briefly reviews different methods and evaluation metrics. The main attention is on the applications of the latest trends, neural network-based, and pre-trained transformer language models. Pre-trained language models now are ruling the NLP field, as one of the main down-stream tasks, Automatic Text Summarization is quite an interdisciplinary task and requires more advanced techniques. But there is a limitation of input and context length results in that the whole article cannot be encoded completely. Motivated by the application of recurrent mechanism in Transformer-XL, we build an abstractive summarizer for long text and evaluate how well it performs on dataset CNN/Daily Mail. The model is under general sequence to sequence structure with a recurrent encoder and stacked Transformer decoder. The obtained ROUGE scores tell that the performance is good as expected.","PeriodicalId":429866,"journal":{"name":"Proceedings of the 2020 1st International Conference on Control, Robotics and Intelligent System","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"12","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2020 1st International Conference on Control, Robotics and Intelligent System","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3437802.3437832","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 12
Abstract
This survey covers Automatic Text Summarization. The information explosion caused by the rapid growth of the internet has made powerful summarizers increasingly necessary. This article briefly reviews the main methods and evaluation metrics, with particular attention to the latest trends: neural network-based approaches and pre-trained transformer language models. Pre-trained language models now dominate the NLP field, and Automatic Text Summarization, one of the main downstream tasks, is an interdisciplinary problem that requires more advanced techniques. However, the limited input and context length of these models means that a whole article cannot be encoded completely. Motivated by the recurrence mechanism of Transformer-XL, we build an abstractive summarizer for long text and evaluate how well it performs on the CNN/Daily Mail dataset. The model follows a general sequence-to-sequence structure with a recurrent encoder and a stacked Transformer decoder. The obtained ROUGE scores indicate that the performance is as good as expected.
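To make the described architecture concrete, below is a minimal sketch (not the authors' code) of a sequence-to-sequence summarizer with a recurrent encoder and a stacked Transformer decoder, as the abstract outlines. The vocabulary size, hidden dimensions, and layer counts are illustrative assumptions, and a plain LSTM stands in for whatever recurrent encoder the paper actually uses.

```python
# Minimal sketch of the architecture named in the abstract:
# recurrent encoder + stacked Transformer decoder (hyperparameters are assumptions).
import torch
import torch.nn as nn


class RecurrentEncoderTransformerDecoder(nn.Module):
    def __init__(self, vocab_size=30000, d_model=512, num_decoder_layers=6, nhead=8):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        # Recurrent encoder: reads the (long) source article token by token.
        self.encoder = nn.LSTM(d_model, d_model, num_layers=2, batch_first=True)
        # Stacked Transformer decoder: cross-attends over the encoder outputs.
        decoder_layer = nn.TransformerDecoderLayer(d_model=d_model, nhead=nhead, batch_first=True)
        self.decoder = nn.TransformerDecoder(decoder_layer, num_layers=num_decoder_layers)
        self.out = nn.Linear(d_model, vocab_size)

    def forward(self, src_ids, tgt_ids):
        memory, _ = self.encoder(self.embed(src_ids))      # (B, S, d_model)
        tgt = self.embed(tgt_ids)                          # (B, T, d_model)
        # Causal mask: each summary position attends only to earlier positions.
        T = tgt_ids.size(1)
        tgt_mask = torch.triu(torch.full((T, T), float("-inf")), diagonal=1)
        dec = self.decoder(tgt, memory, tgt_mask=tgt_mask)
        return self.out(dec)                               # (B, T, vocab_size)


# Smoke test with random token ids (teacher forcing on the target side).
model = RecurrentEncoderTransformerDecoder()
src = torch.randint(0, 30000, (2, 256))   # source article tokens
tgt = torch.randint(0, 30000, (2, 64))    # summary tokens
print(model(src, tgt).shape)              # torch.Size([2, 64, 30000])
```

Generated summaries would then be scored against the CNN/Daily Mail reference summaries with ROUGE, the evaluation metric the abstract reports.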