GAN-Based Image Deblurring Using DCT Discriminator

Hiroki Tomosada, Takahiro Kudo, Takanori Fujisawa, M. Ikehara
{"title":"GAN-Based Image Deblurring Using DCT Discriminator","authors":"Hiroki Tomosada, Takahiro Kudo, Takanori Fujisawa, M. Ikehara","doi":"10.1109/ICPR48806.2021.9412584","DOIUrl":null,"url":null,"abstract":"In this paper, we propose high quality image debluring by using discrete cosine transform (DCT) with less computational complexity. Recently, Convolutional Neural Network (CNN) and Generative Adversarial Network (GAN) based algorithms have been proposed for image deblurring. Moreover, multi-scale architecture of CNN restores blurred image cleary and suppresses more ringing artifacts or block noise, but it takes much time to process. To solve these problems, we propose a method that preserves texture and suppresses ringing artifacts in the restored image without multi-scale architecture using DCT based loss named “DeblurDCTGAN.”. It compares frequency domain of the images made from deblurred image and ground truth image by using DCT. Hereby, DeblurDCTGAN can reduce block noise or ringing artifacts while maintaining deblurring performance. Our experimental results show that DeblurDCTGAN gets the highest performances on both PSNR and SSIM comparing with other conventional methods in GoPro, DVD, NFS and HIDE test Dataset. Also, the running time per pair of DeblurDCTGAN is faster than others.","PeriodicalId":6783,"journal":{"name":"2020 25th International Conference on Pattern Recognition (ICPR)","volume":"10 1","pages":"3675-3681"},"PeriodicalIF":0.0000,"publicationDate":"2021-01-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 25th International Conference on Pattern Recognition (ICPR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICPR48806.2021.9412584","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

In this paper, we propose high-quality image deblurring using the discrete cosine transform (DCT) with low computational complexity. Recently, algorithms based on Convolutional Neural Networks (CNNs) and Generative Adversarial Networks (GANs) have been proposed for image deblurring. A multi-scale CNN architecture restores blurred images clearly and suppresses ringing artifacts and block noise, but it takes much time to process. To solve these problems, we propose a method, named "DeblurDCTGAN," that preserves texture and suppresses ringing artifacts in the restored image without a multi-scale architecture by using a DCT-based loss. The loss compares the deblurred image and the ground-truth image in the frequency domain via the DCT. Thereby, DeblurDCTGAN can reduce block noise and ringing artifacts while maintaining deblurring performance. Our experimental results show that DeblurDCTGAN achieves the highest PSNR and SSIM among the compared conventional methods on the GoPro, DVD, NFS and HIDE test datasets. Its running time per image pair is also faster than that of the other methods.
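The core idea is to compare the restored output and the ground truth in the DCT (frequency) domain rather than only in the pixel domain. The sketch below is a minimal illustration of such a DCT-based comparison using NumPy/SciPy, not the authors' implementation; the function names and the choice of an L1 distance over the DCT coefficients are assumptions.

```python
# Minimal sketch of a DCT-based frequency-domain loss (illustrative only).
# Both images are transformed with a 2-D type-II DCT and the coefficients
# are compared with an L1 distance; penalizing differences in frequency
# content targets lost texture and spurious ringing/block artifacts.
import numpy as np
from scipy.fftpack import dct


def dct2(image):
    """2-D orthonormal type-II DCT of a single-channel image."""
    return dct(dct(image, type=2, norm="ortho", axis=0),
               type=2, norm="ortho", axis=1)


def dct_loss(deblurred, ground_truth):
    """Mean absolute difference between DCT coefficients of two images."""
    return np.mean(np.abs(dct2(deblurred) - dct2(ground_truth)))


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    gt = rng.random((256, 256)).astype(np.float32)          # ground-truth image
    restored = gt + 0.05 * rng.standard_normal((256, 256)).astype(np.float32)
    print("DCT loss:", dct_loss(restored, gt))
```

In a GAN training loop, a term of this form would typically be added to the adversarial and pixel-wise losses of the generator; the weighting between the terms is not specified here.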