Abstractive Summarization Model with a Feature-Enhanced Seq2Seq Structure

Zepeng Hao, Jingzhou Ji, Tao Xie, Bin Xue
{"title":"Abstractive Summarization Model with a Feature-Enhanced Seq2Seq Structure","authors":"Zepeng Hao, Jingzhou Ji, Tao Xie, Bin Xue","doi":"10.1109/ACIRS49895.2020.9162627","DOIUrl":null,"url":null,"abstract":"Abstractive text summarization task is mainly through deep learning method to summarize one or more documents to produce a concise summary that can express the main meaning of the document. Most methods are mainly based on the traditional Seq2Seq structure, but the traditional Seq2Seq structure has limited ability to capture and store long-term features and global features, resulting in a lack of information in the generated summary. In our paper, we put forward a new abstractive summarization model based on feature-enhanced Seq2Seq structure for single document summarization task. This model utilizes two types of feature capture networks to improve the encoder and decoder in traditional Seq2Seq structure, to enhance the model’s ability to capture and store long-term features and global features, so that the generated summary more informative and more fluency. Finally, we verified the model we proposed on the CNN/DailyMail dataset. 
Experimental results demonstrate that the model proposed in this paper is more effective than the baseline model, and has improved by 5.6%, 5.3%, 6.2% on the three metrics R-1, R-2, and R-L.","PeriodicalId":293428,"journal":{"name":"2020 5th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","volume":"62 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 5th Asia-Pacific Conference on Intelligent Robot Systems (ACIRS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACIRS49895.2020.9162627","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2

Abstract

The abstractive text summarization task uses deep learning methods to condense one or more documents into a concise summary that expresses the documents' main meaning. Most approaches build on the traditional Seq2Seq structure, which has limited ability to capture and store long-term and global features, so the generated summaries often lack information. In this paper, we propose a new abstractive summarization model based on a feature-enhanced Seq2Seq structure for the single-document summarization task. The model uses two types of feature-capture networks to improve the encoder and decoder of the traditional Seq2Seq structure, strengthening its ability to capture and store long-term and global features so that the generated summaries are more informative and fluent. Finally, we evaluate the proposed model on the CNN/DailyMail dataset. Experimental results show that it is more effective than the baseline model, improving the three metrics R-1, R-2, and R-L by 5.6%, 5.3%, and 6.2%, respectively.
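The reported gains are in ROUGE (R-1, R-2, R-L), the standard n-gram-overlap metrics for summarization. As a reference point only (not the paper's evaluation code), a minimal ROUGE-N F1 computation over tokenized summaries can be sketched as follows; the function name and tokenization are illustrative assumptions:

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    """Illustrative ROUGE-N F1 between a tokenized candidate summary
    and a tokenized reference summary (not the authors' implementation)."""
    def ngrams(tokens, n):
        # Count each n-gram occurrence in the token sequence.
        return Counter(tuple(tokens[i:i + n]) for i in range(len(tokens) - n + 1))

    cand, ref = ngrams(candidate, n), ngrams(reference, n)
    # Clipped overlap: each n-gram counts at most as often as it
    # appears in the reference (Counter & takes elementwise minimum).
    overlap = sum((cand & ref).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

# Example: two of three unigrams match, so ROUGE-1 F1 = 2/3.
score = rouge_n(["the", "cat", "sat"], ["the", "cat", "ran"], n=1)
```

ROUGE-L, the third metric in the paper, differs in that it scores the longest common subsequence rather than fixed-length n-grams.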