Team MTS @ AutoMin 2021: An Overview of Existing Summarization Approaches and Comparison to Unsupervised Summarization Techniques

O. Iakovenko, A. Andreeva, Anna Lapidus, Liana Mikaelyan
DOI: 10.21437/automin.2021-7
Published: 2021-09-04, in First Shared Task on Automatic Minuting at Interspeech 2021
Citations: 2

Abstract

Remote communication through video or audio conferences has become more popular than ever because of the worldwide pandemic. These events have therefore spurred the development of systems for automatic minuting of spoken language, leading to the AutoMin 2021 challenge. This paper presents the results of the research that team MTS carried out while participating in the Automatic Minutes challenge. In particular, we analyze existing approaches to text and speech summarization, propose an unsupervised summarization technique based on clustering, and provide a pipeline that includes an adapted automatic speech recognition block able to run on real-life recordings. The proposed unsupervised technique outperforms pre-trained summarization models on the automatic minuting task, with ROUGE-1, ROUGE-2 and ROUGE-L values of 0.21, 0.02 and 0.2 on the dev set, and ROUGE-1, ROUGE-2, ROUGE-L, Adequacy, Grammatical correctness and Fluency values of 0.180, 0.035, 0.098, 1.857, 2.304 and 1.911 on the test set, respectively.
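The abstract does not spell out the clustering pipeline, so the following is only a minimal sketch of the general technique it names: unsupervised extractive summarization that embeds sentences (here with a hand-rolled TF-IDF, an assumption — the paper may use different sentence representations), clusters them with k-means, and keeps the sentence nearest each cluster centroid. All function names below are illustrative, not from the paper.

```python
import math
import re
from collections import Counter

def tfidf_vectors(sentences):
    # Tokenize each sentence and build dense TF-IDF vectors over the shared vocabulary.
    docs = [re.findall(r"[a-z']+", s.lower()) for s in sentences]
    df = Counter()
    for toks in docs:
        df.update(set(toks))
    n = len(docs)
    vocab = sorted(df)
    idx = {w: i for i, w in enumerate(vocab)}
    vecs = []
    for toks in docs:
        tf = Counter(toks)
        v = [0.0] * len(vocab)
        for w, c in tf.items():
            v[idx[w]] = (c / len(toks)) * math.log(n / df[w])
        vecs.append(v)
    return vecs

def kmeans(vecs, k, iters=20):
    # Plain k-means with deterministic initialization (first k vectors as seeds).
    centroids = [vecs[i][:] for i in range(k)]
    assign = [0] * len(vecs)
    for _ in range(iters):
        for i, v in enumerate(vecs):
            assign[i] = min(
                range(k),
                key=lambda c: sum((a - b) ** 2 for a, b in zip(v, centroids[c])),
            )
        for c in range(k):
            members = [vecs[i] for i in range(len(vecs)) if assign[i] == c]
            if members:
                centroids[c] = [sum(col) / len(members) for col in zip(*members)]
    return assign, centroids

def summarize(sentences, k=2):
    # Extractive summary: one representative sentence per cluster,
    # chosen as the member closest to its centroid, in original order.
    vecs = tfidf_vectors(sentences)
    assign, centroids = kmeans(vecs, k)
    picked = []
    for c in range(k):
        members = [i for i in range(len(sentences)) if assign[i] == c]
        if not members:
            continue
        best = min(
            members,
            key=lambda i: sum((a - b) ** 2 for a, b in zip(vecs[i], centroids[c])),
        )
        picked.append(best)
    return [sentences[i] for i in sorted(picked)]
```

The number of clusters k controls the summary length; for meeting minutes it would plausibly be tied to the number of discussion topics, though the paper's actual heuristic is not stated in the abstract.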
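The ROUGE-1, ROUGE-2 and ROUGE-L figures quoted above are n-gram overlap scores between system and reference minutes. As a reference point, a minimal ROUGE-N F1 implementation looks like the sketch below (shared-task evaluations typically use an official ROUGE toolkit with stemming and other options, which this sketch omits):

```python
from collections import Counter

def rouge_n(candidate, reference, n=1):
    # F1 over clipped n-gram overlap between a candidate and a reference summary.
    def ngrams(text, n):
        toks = text.lower().split()
        return Counter(tuple(toks[i:i + n]) for i in range(len(toks) - n + 1))

    c, r = ngrams(candidate, n), ngrams(reference, n)
    if not c or not r:
        return 0.0
    overlap = sum((c & r).values())  # Counter intersection clips repeated n-grams
    prec = overlap / sum(c.values())
    rec = overlap / sum(r.values())
    return 0.0 if prec + rec == 0 else 2 * prec * rec / (prec + rec)
```

ROUGE-L, also reported in the abstract, instead scores the longest common subsequence between candidate and reference rather than fixed-size n-grams.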