{"title":"Abstractive summarization by neural attention model with document content memory","authors":"YunSeok Choi, Dahae Kim, Jee-Hyong Lee","doi":"10.1145/3264746.3264773","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a generative approach for abstractive summarization, which creates summaries based on a language model. The main goal of our paper is to generate a long sequence of words with coherent sentences by reflecting the key concepts of the original document and the characteristics of summaries. To achieve this goal, we propose an attention mechanism that uses Document Content Memory for learning the language model effectively. To evaluate its effectiveness, the proposed methods are compared with other language models and an extractive summarization method. The results demonstrated that the proposed methods could be competitive with other approaches.","PeriodicalId":186790,"journal":{"name":"Proceedings of the 2018 Conference on Research in Adaptive and Convergent Systems","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2018 Conference on Research in Adaptive and Convergent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3264746.3264773","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
In this paper, we propose a generative approach to abstractive summarization that creates summaries based on a language model. The main goal of our work is to generate long sequences of words forming coherent sentences that reflect both the key concepts of the original document and the characteristics of summaries. To achieve this goal, we propose an attention mechanism that uses a Document Content Memory to learn the language model effectively. To evaluate its effectiveness, we compare the proposed methods with other language models and with an extractive summarization method. The results demonstrate that the proposed methods can be competitive with these approaches.
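The abstract does not specify how the Document Content Memory is wired into the model. The sketch below is our own illustration, not the authors' implementation: it shows one plausible way to condition a recurrent language-model decoder on a memory of encoded document vectors via attention, assuming a PyTorch implementation. All class and variable names (DocumentContentAttention, MemoryConditionedLM, memory, etc.) are hypothetical.

```python
# Hypothetical sketch (not the paper's code): one decoder step of a language
# model that attends over a "document content memory" -- a (batch, slots, dim)
# matrix of encoded document vectors -- using a bilinear attention score.
import torch
import torch.nn as nn
import torch.nn.functional as F

class DocumentContentAttention(nn.Module):
    def __init__(self, hidden_dim: int, memory_dim: int):
        super().__init__()
        # Bilinear score between the decoder state and each memory slot.
        self.score = nn.Linear(hidden_dim, memory_dim, bias=False)

    def forward(self, hidden, memory):
        # hidden: (batch, hidden_dim); memory: (batch, slots, memory_dim)
        scores = torch.bmm(memory, self.score(hidden).unsqueeze(2))   # (batch, slots, 1)
        weights = F.softmax(scores.squeeze(2), dim=1)                 # (batch, slots)
        context = torch.bmm(weights.unsqueeze(1), memory).squeeze(1)  # (batch, memory_dim)
        return context, weights

class MemoryConditionedLM(nn.Module):
    def __init__(self, vocab_size, embed_dim, hidden_dim, memory_dim):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRUCell(embed_dim + memory_dim, hidden_dim)
        self.attn = DocumentContentAttention(hidden_dim, memory_dim)
        self.out = nn.Linear(hidden_dim + memory_dim, vocab_size)

    def step(self, prev_token, hidden, memory):
        # Attend over the document content memory with the current state,
        # then feed the resulting context into the recurrent update and
        # the output projection, biasing generation toward document content.
        context, _ = self.attn(hidden, memory)
        x = torch.cat([self.embed(prev_token), context], dim=1)
        hidden = self.rnn(x, hidden)
        logits = self.out(torch.cat([hidden, context], dim=1))
        return logits, hidden
```

In this reading, the memory acts as a fixed store of document content that the decoder consults at every step, so long generated sequences stay grounded in the source document rather than drifting as an unconditioned language model would.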