Image deconvolution using hybrid threshold based on modified L1-clipped penalty in EM framework

IF 3.4 · Zone 2 (Engineering & Technology) · Q2 ENGINEERING, ELECTRICAL & ELECTRONIC
Ravi Pratap Singh , Manoj Kumar Singh
{"title":"在 EM 框架中使用基于修改后 L1 截断惩罚的混合阈值进行图像解卷积","authors":"Ravi Pratap Singh ,&nbsp;Manoj Kumar Singh","doi":"10.1016/j.sigpro.2024.109725","DOIUrl":null,"url":null,"abstract":"<div><div>Image deconvolution remains a challenging task due to its inherent ill-posedness. While existing algorithms show strong numerical performance, their complexity often complicates analysis and implementation. This paper introduces a computationally efficient image deconvolution method within the expectation maximization (EM) framework. The proposed algorithm alternates between an E-step leveraging the fast Fourier transform (FFT) and an M-step utilizing the discrete wavelet transform (DWT). In the M-step, we introduce a novel L<sub>1</sub>-clipped penalty to compute the maximum a posteriori (MAP) estimate, resulting in a hybrid threshold that combines the strengths of soft and hard thresholding. This hybrid threshold is mathematically derived, overcoming the high variance of hard-thresholding and the high bias of soft-thresholding, thus optimizing the trade-off between variance and bias. Extensive experiments demonstrate that our method significantly outperforms state-of-the-art techniques in terms of improved signal-to-noise ratio (ISNR) and peak signal-to-noise ratio (PSNR), as well as visual quality. Notably, the proposed method shows average PSNR improvements of 3.49 dB, 4.23 dB, and 1.44 dB for uniform blur and 0.76 dB, 3.57 dB, and 0.66 dB for Gaussian blur on the Set12, BSD68, and Set14 datasets, respectively.</div></div>","PeriodicalId":49523,"journal":{"name":"Signal Processing","volume":"227 ","pages":"Article 109725"},"PeriodicalIF":3.4000,"publicationDate":"2024-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Image deconvolution using hybrid threshold based on modified L1-clipped penalty in EM framework\",\"authors\":\"Ravi Pratap Singh ,&nbsp;Manoj Kumar Singh\",\"doi\":\"10.1016/j.sigpro.2024.109725\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Image deconvolution remains a challenging task due to its inherent ill-posedness. While existing algorithms show strong numerical performance, their complexity often complicates analysis and implementation. This paper introduces a computationally efficient image deconvolution method within the expectation maximization (EM) framework. The proposed algorithm alternates between an E-step leveraging the fast Fourier transform (FFT) and an M-step utilizing the discrete wavelet transform (DWT). In the M-step, we introduce a novel L<sub>1</sub>-clipped penalty to compute the maximum a posteriori (MAP) estimate, resulting in a hybrid threshold that combines the strengths of soft and hard thresholding. This hybrid threshold is mathematically derived, overcoming the high variance of hard-thresholding and the high bias of soft-thresholding, thus optimizing the trade-off between variance and bias. Extensive experiments demonstrate that our method significantly outperforms state-of-the-art techniques in terms of improved signal-to-noise ratio (ISNR) and peak signal-to-noise ratio (PSNR), as well as visual quality. 
Notably, the proposed method shows average PSNR improvements of 3.49 dB, 4.23 dB, and 1.44 dB for uniform blur and 0.76 dB, 3.57 dB, and 0.66 dB for Gaussian blur on the Set12, BSD68, and Set14 datasets, respectively.</div></div>\",\"PeriodicalId\":49523,\"journal\":{\"name\":\"Signal Processing\",\"volume\":\"227 \",\"pages\":\"Article 109725\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2024-09-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Signal Processing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0165168424003451\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Signal Processing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0165168424003451","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Image deconvolution remains a challenging task due to its inherent ill-posedness. While existing algorithms show strong numerical performance, their complexity often complicates analysis and implementation. This paper introduces a computationally efficient image deconvolution method within the expectation maximization (EM) framework. The proposed algorithm alternates between an E-step leveraging the fast Fourier transform (FFT) and an M-step utilizing the discrete wavelet transform (DWT). In the M-step, we introduce a novel L1-clipped penalty to compute the maximum a posteriori (MAP) estimate, resulting in a hybrid threshold that combines the strengths of soft and hard thresholding. This hybrid threshold is mathematically derived, overcoming the high variance of hard-thresholding and the high bias of soft-thresholding, thus optimizing the trade-off between variance and bias. Extensive experiments demonstrate that our method significantly outperforms state-of-the-art techniques in terms of improved signal-to-noise ratio (ISNR) and peak signal-to-noise ratio (PSNR), as well as visual quality. Notably, the proposed method shows average PSNR improvements of 3.49 dB, 4.23 dB, and 1.44 dB for uniform blur and 0.76 dB, 3.57 dB, and 0.66 dB for Gaussian blur on the Set12, BSD68, and Set14 datasets, respectively.
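
To make the alternation described in the abstract concrete, the sketch below outlines a generic EM-style wavelet-domain deconvolution loop: an FFT-based E-step (a Landweber-type data-fidelity update) followed by a DWT-based M-step that applies a hybrid soft/hard threshold. This is an illustrative sketch, not the authors' implementation: the exact threshold derived from the modified L1-clipped penalty is given in the paper, and the hybrid_threshold stand-in, the regularization weight lam, the clip point, and the step-size bound alpha used here are assumptions chosen for illustration (NumPy and PyWavelets assumed available).

```python
# Illustrative sketch only (assumes circular blur, NumPy + PyWavelets installed);
# the paper's exact hybrid threshold from the modified L1-clipped penalty is
# NOT reproduced here.
import numpy as np
import pywt


def hybrid_threshold(w, lam, clip):
    """Stand-in hybrid rule: soft-shrink small coefficients, keep large ones (hard)."""
    out = pywt.threshold(w, lam, mode="soft")  # soft shrinkage controls variance
    keep = np.abs(w) > clip                    # beyond the (assumed) clip point ...
    out[keep] = w[keep]                        # ... leave coefficients unshrunk (low bias)
    return out


def em_deconvolve(y, h, lam=0.02, clip=0.1, n_iter=50, wavelet="db4", level=3):
    """EM-style loop: FFT-based E-step, DWT-based M-step with the hybrid threshold."""
    H = np.fft.fft2(h, s=y.shape)              # blur kernel in the Fourier domain
    alpha = np.max(np.abs(H)) ** 2             # step-size bound (assumed, for stability)
    x = y.copy()
    for _ in range(n_iter):
        # E-step: data-fidelity update computed with the FFT
        residual = y - np.real(np.fft.ifft2(H * np.fft.fft2(x)))
        z = x + np.real(np.fft.ifft2(np.conj(H) * np.fft.fft2(residual))) / alpha
        # M-step: denoise in the wavelet domain with the hybrid threshold
        coeffs = pywt.wavedec2(z, wavelet, level=level)
        coeffs = [coeffs[0]] + [
            tuple(hybrid_threshold(c, lam, clip) for c in detail)
            for detail in coeffs[1:]
        ]
        x = pywt.waverec2(coeffs, wavelet)[: y.shape[0], : y.shape[1]]
    return x
```

The stand-in threshold mimics the bias/variance trade-off the abstract describes: small coefficients are shrunk as in soft thresholding (controlling variance), while coefficients beyond the clip point are kept unshrunk as in hard thresholding (controlling bias).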
Source journal
Signal Processing
Category: Engineering & Technology, Electrical & Electronic Engineering
CiteScore: 9.20
Self-citation rate: 9.10%
Articles published: 309
Review time: 41 days
Journal description: Signal Processing incorporates all aspects of the theory and practice of signal processing. It features original research work, tutorial and review articles, and accounts of practical developments. It is intended for a rapid dissemination of knowledge and experience to engineers and scientists working in the research, development or practical application of signal processing. Subject areas covered by the journal include: Signal Theory; Stochastic Processes; Detection and Estimation; Spectral Analysis; Filtering; Signal Processing Systems; Software Developments; Image Processing; Pattern Recognition; Optical Signal Processing; Digital Signal Processing; Multi-dimensional Signal Processing; Communication Signal Processing; Biomedical Signal Processing; Geophysical and Astrophysical Signal Processing; Earth Resources Signal Processing; Acoustic and Vibration Signal Processing; Data Processing; Remote Sensing; Signal Processing Technology; Radar Signal Processing; Sonar Signal Processing; Industrial Applications; New Applications.