Objective Metrics to Evaluate Residual-Echo Suppression During Double-Talk in the Stereophonic Case

Amir Ivry, I. Cohen, B. Berdugo
Interspeech 2022, pages 5348–5352. Published 2022-09-18. DOI: 10.21437/interspeech.2022-673 (https://doi.org/10.21437/interspeech.2022-673)

Citations: 0

Abstract

Speech quality, as evaluated by humans, is most accurately assessed by subjective human ratings. The objective acoustic echo cancellation mean opinion score (AECMOS) metric was recently introduced and achieved high accuracy in predicting human perception during double-talk. Residual-echo suppression (RES) systems, however, employ the signal-to-distortion ratio (SDR) metric to quantify speech quality during double-talk. In this study, we focus on stereophonic acoustic echo cancellation and show that the stereo SDR (SSDR) correlates poorly with subjective human ratings according to the AECMOS, since the SSDR is influenced by both the distortion of desired speech and the presence of residual echo. We introduce a pair of objective metrics that distinctly assess the stereo desired-speech maintained level (SDSML) and the stereo residual-echo suppression level (SRESL) during double-talk. By employing a tunable RES system based on deep learning and using 100 hours of real and simulated recordings, the SDSML and SRESL metrics show high correlation with the AECMOS across various setups. We also investigate how the design parameter governs the SDSML-SRESL tradeoff, and harness this relation to allow optimal performance under frequently changing user demands in practical cases.
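For reference, the conventional SDR metric that the abstract contrasts with the proposed SDSML/SRESL pair is the standard energy ratio between the desired signal and the distortion it suffers. The sketch below is a minimal illustration of that textbook definition only; the paper's stereo variants (SSDR, SDSML, SRESL) are defined in the paper itself and are not reproduced here, and the signals used are synthetic stand-ins.

```python
import numpy as np

def sdr(reference: np.ndarray, estimate: np.ndarray) -> float:
    """Standard signal-to-distortion ratio in dB:
    10 * log10( ||reference||^2 / ||reference - estimate||^2 ).

    reference: clean desired-speech signal
    estimate:  output of a residual-echo suppressor
    """
    distortion = reference - estimate
    return 10.0 * np.log10(np.sum(reference ** 2) / np.sum(distortion ** 2))

# Synthetic example: a mildly distorted copy of a "desired speech" frame.
rng = np.random.default_rng(0)
s = rng.standard_normal(16000)                 # 1 s of unit-variance signal at 16 kHz
s_hat = s + 0.1 * rng.standard_normal(16000)   # estimate with 10% distortion amplitude
print(f"SDR: {sdr(s, s_hat):.1f} dB")          # roughly 20 dB for this distortion level
```

Because a single ratio like this mixes desired-speech distortion with leftover echo energy, it cannot tell the two apart, which is the motivation the abstract gives for splitting the assessment into separate SDSML and SRESL scores.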