Prototypical Distribution Divergence Loss for Image Restoration
Jialun Peng; Jingjing Fu; Dong Liu
IEEE Transactions on Image Processing, vol. 34, pp. 3563-3577, published 2025-06-02
DOI: 10.1109/TIP.2025.3572818
URL: https://ieeexplore.ieee.org/document/11018214/
Citations: 0
Abstract
Neural networks have achieved significant advances in image restoration, and much research has focused on designing new architectures for convolutional neural networks (CNNs) and Transformers. The choice of loss function, despite being a critical factor when training image restoration networks, has attracted little attention. Existing losses are primarily based on semantic or hand-crafted representations. Recently, discrete representations have demonstrated strong capabilities in representing images. In this work, we explore a loss based on discrete representations for image restoration. Specifically, we propose a Local Residual Quantized Variational AutoEncoder (Local RQ-VAE) to learn prototype vectors that represent the local details of high-quality images. We then propose a Prototypical Distribution Divergence (PDD) loss that measures the Kullback-Leibler divergence between the prototypical distributions of the restored and target images. Experimental results demonstrate that our PDD loss improves restored images in both PSNR and visual quality for state-of-the-art CNNs and Transformers on several image restoration tasks, including image super-resolution, image denoising, motion deblurring, and defocus deblurring.
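The core computation the abstract describes can be sketched as follows. This is a minimal illustration, not the paper's implementation: it assumes that each local feature is soft-assigned to the learned prototype vectors via a softmax over negative squared distances, and that the PDD loss is the KL divergence between the resulting target and restored distributions, averaged over local positions. The function names, the temperature `tau`, and the assignment rule are illustrative assumptions; the paper's Local RQ-VAE training and feature extraction are not reproduced here.

```python
import numpy as np

def prototypical_distribution(features, prototypes, tau=1.0):
    """Soft assignment of local features to prototypes (illustrative).

    features:   (N, D) local feature vectors extracted from an image
    prototypes: (K, D) prototype vectors learned by the Local RQ-VAE
    Returns an (N, K) matrix whose rows are probability distributions,
    computed as softmax(-||f - p||^2 / tau) over prototypes.
    """
    d2 = ((features[:, None, :] - prototypes[None, :, :]) ** 2).sum(axis=-1)  # (N, K)
    logits = -d2 / tau
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    p = np.exp(logits)
    return p / p.sum(axis=1, keepdims=True)

def pdd_loss(restored_feats, target_feats, prototypes, eps=1e-12):
    """KL(target || restored) between prototypical distributions, averaged
    over local positions. eps guards against log(0)."""
    p = prototypical_distribution(target_feats, prototypes)    # target distribution
    q = prototypical_distribution(restored_feats, prototypes)  # restored distribution
    kl = np.sum(p * np.log((p + eps) / (q + eps)), axis=1)     # per-position KL
    return float(kl.mean())
```

Since KL divergence is zero only when the two distributions coincide, the loss vanishes when restored and target features match and grows as their prototype assignments diverge, which is the behavior the abstract attributes to the PDD loss.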