Multi-document abstractive summarization using chunk-graph and recurrent neural network

J. Niu, Huanpei Chen, Qingjuan Zhao, Limin Su, Mohammed Atiquzzaman
2017 IEEE International Conference on Communications (ICC), May 21, 2017, pp. 1-6. DOI: 10.1109/ICC.2017.7996331
Citations: 15

Abstract

Automatic multi-document abstractive summarization systems condense several documents into a single short one composed of newly generated sentences. Many existing systems are based on word-graphs and ILP, and discard large numbers of sentences because of their heavy computational load. To reduce computation and generate readable, informative summaries, we propose a novel abstractive multi-document summarization system based on a chunk-graph (CG) and a recurrent neural network language model (RNNLM). In our approach, a CG, which extends the word-graph, is constructed to organize all the information in a sentence cluster; the CG reduces the size of the graph while preserving more semantic information than a word-graph. We use beam search and a character-level RNNLM to generate readable, informative summaries from the CG for each sentence cluster; an RNNLM evaluates sentence linguistic quality better than an n-gram language model. Experimental results show that our proposed system outperforms all baseline systems and matches state-of-the-art systems, and that the system with the CG generates better summaries than one with an ordinary word-graph.
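The pipeline the abstract describes can be sketched as follows: merge shared chunks across a sentence cluster into a directed graph, then run beam search over the graph to extract a fluent path as the summary. This is a minimal illustrative sketch, not the paper's implementation: the chunker (`bigram_chunker`) and the flat scorer are toy stand-ins for the paper's linguistically motivated chunk segmentation and trained character-level RNNLM, and all names are hypothetical.

```python
# Hypothetical sketch of a chunk-graph plus beam-search summarizer.
# The chunker and scorer below are toy stand-ins; the paper uses real
# chunk segmentation and a trained character-level RNNLM as the scorer.
from collections import defaultdict

START, END = "<s>", "</s>"

def build_chunk_graph(sentences, chunker):
    """Merge identical chunks across sentences into shared nodes,
    keeping a directed edge between each pair of adjacent chunks."""
    edges = defaultdict(set)
    for sent in sentences:
        chunks = [START] + chunker(sent) + [END]
        for a, b in zip(chunks, chunks[1:]):
            edges[a].add(b)
    return edges

def beam_search(edges, score, beam_width=4, max_len=12):
    """Expand paths from START, keeping only the beam_width
    best-scoring partial paths; return complete paths at END."""
    beam = [([START], 0.0)]
    finished = []
    for _ in range(max_len):
        candidates = []
        for path, s in beam:
            for nxt in edges.get(path[-1], ()):
                new = (path + [nxt], s + score(path, nxt))
                (finished if nxt == END else candidates).append(new)
        beam = sorted(candidates, key=lambda p: -p[1])[:beam_width]
        if not beam:
            break
    return sorted(finished, key=lambda p: -p[1])

# Toy chunker: group words in pairs; a real system would use syntactic chunks.
def bigram_chunker(sent):
    words = sent.split()
    return [" ".join(words[i:i + 2]) for i in range(0, len(words), 2)]

sentences = [
    "the system generates short summaries",
    "the system generates readable summaries",
]
graph = build_chunk_graph(sentences, bigram_chunker)
# A flat score keeps every path equal; an RNNLM would rank paths by fluency.
best = beam_search(graph, score=lambda path, nxt: 0.0)
print(" ".join(best[0][0][1:-1]) if best else "")
```

Because the two input sentences share the chunks "the system" and "summaries", the graph has five content nodes instead of eight word-pair occurrences, which is the size reduction the chunk-graph is meant to provide; swapping the flat scorer for a language-model log-probability would make beam search prefer the more fluent of the merged paths.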