Abstractive Dialog Summarization using Two Stage Framework with Contrastive Learning

Sudipto Dip Halder, Mahit Kumar Paul, Bayezid Islam
{"title":"Abstractive Dialog Summarization using Two Stage Framework with Contrastive Learning","authors":"Sudipto Dip Halder, Mahit Kumar Paul, Bayezid Islam","doi":"10.1109/ICCIT57492.2022.10055286","DOIUrl":null,"url":null,"abstract":"In the modern era, a large amount of text conversation data between two or more interlocutors is generated by different online service consumers every hour. Converting such a long conversation into a concise form is more useful for further analysis and can boost service quality when conducted in an efficient manner. Abstractive summarization models usually suffer from performance degradation due to the different objective functions used in the training and inference steps. Contrastive learning is a powerful technique for developing training objectives that are similar to evaluation metrics and thus improve performance. Two-stage framework with contrastive learning are gaining popularity to mitigate this gap but this approach is very daunting in the field because of its huge computation time and demand for memory usage. To address this issue, we propose an optimization in the two-stage framework architecture for dialog summarization using the ALBERT pre-trained model in the evaluator section which is more efficient with respect to the usage of resources. Our method significantly outperforms strong baselines on SAMSum and DialogSum dataset for abstractive dialog summarization task.","PeriodicalId":255498,"journal":{"name":"2022 25th International Conference on Computer and Information Technology (ICCIT)","volume":"72 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 25th International Conference on Computer and Information Technology (ICCIT)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCIT57492.2022.10055286","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In the modern era, a large amount of text conversation data between two or more interlocutors is generated by online service consumers every hour. Converting such long conversations into a concise form is useful for further analysis and, when done efficiently, can boost service quality. Abstractive summarization models usually suffer from performance degradation because different objective functions are used in the training and inference steps. Contrastive learning is a powerful technique for constructing training objectives that resemble the evaluation metrics and thus improve performance. Two-stage frameworks with contrastive learning are gaining popularity as a way to mitigate this gap, but the approach is daunting in practice because of its large computation time and memory demands. To address this issue, we propose an optimization of the two-stage framework architecture for dialog summarization that uses the ALBERT pre-trained model in the evaluator stage, which is more resource-efficient. Our method significantly outperforms strong baselines on the SAMSum and DialogSum datasets for the abstractive dialog summarization task.
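The abstract describes a two-stage pipeline: a generator produces several candidate summaries for a dialogue, and an ALBERT-based evaluator re-ranks them using a contrastive objective so that candidates closer to the evaluation metric receive higher scores. The sketch below illustrates that general idea with Hugging Face components; the model checkpoints (facebook/bart-large, albert-base-v2), the linear scoring head, and the margin-ranking loss are illustrative assumptions, not the authors' released implementation.

```python
# Hedged sketch of a two-stage dialog summarization pipeline:
# stage 1 generates candidate summaries, stage 2 re-ranks them with an
# ALBERT evaluator trained with a contrastive (margin ranking) loss.
import torch
from transformers import (
    BartForConditionalGeneration, BartTokenizer,
    AlbertModel, AlbertTokenizer,
)

# --- Stage 1: candidate generation with a seq2seq model (assumed checkpoint) ---
gen_tok = BartTokenizer.from_pretrained("facebook/bart-large")
generator = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

def generate_candidates(dialogue: str, num_candidates: int = 4):
    inputs = gen_tok(dialogue, return_tensors="pt", truncation=True, max_length=1024)
    outputs = generator.generate(
        **inputs,
        num_beams=num_candidates,
        num_return_sequences=num_candidates,
        max_length=128,
    )
    return [gen_tok.decode(o, skip_special_tokens=True) for o in outputs]

# --- Stage 2: ALBERT evaluator scores (dialogue, candidate) pairs ---
eval_tok = AlbertTokenizer.from_pretrained("albert-base-v2")
albert = AlbertModel.from_pretrained("albert-base-v2")
score_head = torch.nn.Linear(albert.config.hidden_size, 1)  # assumed scoring head

def score(dialogue: str, candidate: str) -> torch.Tensor:
    enc = eval_tok(dialogue, candidate, return_tensors="pt",
                   truncation=True, max_length=512)
    cls = albert(**enc).last_hidden_state[:, 0]  # [CLS] representation
    return score_head(cls).squeeze(-1)

def contrastive_ranking_loss(scores: torch.Tensor, margin: float = 0.01):
    # Candidates are assumed sorted by a reference metric (e.g. ROUGE) in
    # descending order; higher-ranked candidates should get higher scores.
    loss = torch.zeros(())
    for i in range(len(scores)):
        for j in range(i + 1, len(scores)):
            loss = loss + torch.relu(scores[j] - scores[i] + margin * (j - i))
    return loss
```

At inference, one would call generate_candidates, score each candidate against the dialogue, and return the highest-scoring candidate as the final summary; during evaluator training the candidates are ordered by ROUGE against the reference summary so the ranking loss above can be applied.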