Alias minimization of 1-D signals using DCT based learning

Prashant Garg, M. Maheshwari, Sameer Dubey, M. Joshi, Vijaykumar Chakka, A. Banerjee
{"title":"Alias minimization of 1-D signals using DCT based learning","authors":"Prashant Garg, M. Maheshwari, Sameer Dubey, M. Joshi, Vijaykumar Chakka, A. Banerjee","doi":"10.1109/MWSCAS.2010.5548852","DOIUrl":null,"url":null,"abstract":"In this paper, we propose a learning based approach for alias minimization of 1-D signals. Given an under-sampled test speech signal and a training set consisting of several speech signals each of which are under-sampled as well as sampled at greater than Nyquist rate, we estimate the non-aliased frequencies for the test signal using the training set. The learning of non-aliased frequencies corresponds to estimating them using a training set. The test signal and each of the under-sampled training set signal are first interpolated to the size of The non-aliased signals. They are then divided into a number of segments and discrete cosine transform (DCT) is computed for each segment. Assuming that the lower frequencies are non-aliased and minimally distorted, we replace the aliased DCT coefficients of the test signal with the best search from the training set. The non-aliased test signal is then re-constructed by taking the inverse DCT. The comparison with the standard interpolation technique in terms of both subjective and quantitative analysis indicates better performance.","PeriodicalId":245322,"journal":{"name":"2010 53rd IEEE International Midwest Symposium on Circuits and Systems","volume":"22 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 53rd IEEE International Midwest Symposium on Circuits and Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MWSCAS.2010.5548852","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In this paper, we propose a learning-based approach for alias minimization of 1-D signals. Given an under-sampled test speech signal and a training set consisting of several speech signals, each of which is available both under-sampled and sampled above the Nyquist rate, we estimate the non-aliased frequencies of the test signal using the training set. Learning the non-aliased frequencies corresponds to estimating them from this training set. The test signal and each of the under-sampled training signals are first interpolated to the length of the non-aliased signals. They are then divided into a number of segments, and the discrete cosine transform (DCT) is computed for each segment. Assuming that the lower frequencies are non-aliased and minimally distorted, we replace the aliased DCT coefficients of the test signal with those of the best-matching segment found in the training set. The non-aliased test signal is then reconstructed by taking the inverse DCT. Comparison with a standard interpolation technique, in terms of both subjective and quantitative analysis, indicates better performance.
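The following Python sketch illustrates one possible reading of the procedure described in the abstract. The segment length, the fraction of DCT coefficients treated as non-aliased, the L2 matching criterion, and the assumption that replacement coefficients come from the full-rate training segments are all assumptions for illustration, not details confirmed by the paper.

```python
# A minimal sketch of DCT-based alias minimization, assuming segment-wise
# matching on low-frequency DCT coefficients. Parameter values and the
# matching criterion are illustrative assumptions, not the paper's settings.
import numpy as np
from scipy.fft import dct, idct
from scipy.signal import resample


def dealias(test_lowrate, training_pairs, target_len,
            segment_len=256, non_aliased_frac=0.5):
    """Estimate a non-aliased version of an under-sampled test signal.

    training_pairs : list of (under_sampled, full_rate) signal pairs.
    The under-sampled signals are interpolated to target_len, segmented,
    and matched to the test segments on their low DCT coefficients; the
    high coefficients are then borrowed from the corresponding full-rate
    training segment and the result is reconstructed with the inverse DCT.
    """
    # Interpolate the under-sampled signals to the non-aliased length.
    test_up = resample(test_lowrate, target_len)
    pairs_up = [(resample(lo, target_len), resample(hi, target_len))
                for lo, hi in training_pairs]

    split = int(non_aliased_frac * segment_len)  # assumed aliasing boundary
    out = np.zeros(target_len)

    for start in range(0, target_len - segment_len + 1, segment_len):
        c_test = dct(test_up[start:start + segment_len], norm='ortho')

        # Search all training segments for the closest low-frequency match.
        best_hi, best_err = None, np.inf
        for lo_up, hi_up in pairs_up:
            for ts in range(0, target_len - segment_len + 1, segment_len):
                c_lo = dct(lo_up[ts:ts + segment_len], norm='ortho')
                err = np.sum((c_lo[:split] - c_test[:split]) ** 2)
                if err < best_err:
                    best_err = err
                    best_hi = dct(hi_up[ts:ts + segment_len], norm='ortho')

        # Keep the test signal's (presumed non-aliased) low frequencies;
        # replace the aliased high frequencies with the matched ones.
        c_test[split:] = best_hi[split:]
        out[start:start + segment_len] = idct(c_test, norm='ortho')

    return out
```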