Coupled Adversarial Learning for Single Image Super-Resolution

Chih-Chung Hsu, Kuan Huang
{"title":"Coupled Adversarial Learning for Single Image Super-Resolution","authors":"Chih-Chung Hsu, Kuan Huang","doi":"10.1109/SAM48682.2020.9104288","DOIUrl":null,"url":null,"abstract":"Generative adversarial nets (GAN) have been widely used in several image restoration tasks such as image denoise, enhancement, and super-resolution. The objective functions of an image super-resolution problem based on GANs usually are reconstruction error, semantic feature distance, and GAN loss. In general, semantic feature distance was used to measure the feature similarity between the super-resolved and ground-truth images, to ensure they have similar feature representations. However, the feature is usually extracted by the pre-trained model, in which the feature representation is not designed for distinguishing the extracted features from low-resolution and high-resolution images. In this study, a coupled adversarial net (CAN) based on Siamese Network Structure is proposed, to improve the effectiveness of the feature extraction. In the proposed CAN, we offer GAN loss and semantic feature distances simultaneously, reducing the training complexity as well as improving the performance. Extensive experiments conducted that the proposed CAN is effective and efficient, compared to state-of-the-art image super-resolution schemes.","PeriodicalId":6753,"journal":{"name":"2020 IEEE 11th Sensor Array and Multichannel Signal Processing Workshop (SAM)","volume":"35 1","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 IEEE 11th Sensor Array and Multichannel Signal Processing Workshop (SAM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAM48682.2020.9104288","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

Generative adversarial networks (GANs) have been widely used in image restoration tasks such as denoising, enhancement, and super-resolution. The objective function of a GAN-based image super-resolution method typically combines a reconstruction error, a semantic feature distance, and a GAN loss. The semantic feature distance measures the feature similarity between the super-resolved image and the ground-truth image, encouraging them to have similar feature representations. However, the features are usually extracted by a pre-trained model whose representation was not designed to distinguish the features of low-resolution images from those of high-resolution images. In this study, a coupled adversarial net (CAN) based on a Siamese network structure is proposed to improve the effectiveness of the feature extraction. The proposed CAN provides the GAN loss and the semantic feature distance simultaneously, reducing training complexity while improving performance. Extensive experiments show that the proposed CAN is effective and efficient compared with state-of-the-art image super-resolution schemes.
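To make the loss structure described above concrete, the following is a minimal, hypothetical PyTorch sketch of such a coupled objective: a Siamese-style discriminator whose shared trunk supplies the feature embedding used for the semantic feature distance, while a small head supplies the adversarial score. The architecture, module names, and loss weights (w_feat, w_adv) are illustrative assumptions, not the paper's exact design.

```python
# Hedged sketch of the coupled loss described in the abstract: one
# Siamese-style discriminator returns both an adversarial logit and a
# feature embedding, so the GAN loss and the semantic feature distance
# come from a single jointly trained network rather than a frozen
# pre-trained extractor. Shapes and weightings are assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseDiscriminator(nn.Module):
    """Shared-weight trunk; its embedding doubles as the semantic feature."""
    def __init__(self):
        super().__init__()
        self.trunk = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        self.score = nn.Linear(128, 1)  # adversarial (real/fake) logit

    def forward(self, x):
        feat = self.trunk(x)            # embedding for the feature distance
        return self.score(feat), feat

def generator_loss(sr, hr, disc, w_feat=1.0, w_adv=1e-3):
    """Reconstruction error + coupled feature distance + GAN loss.
    Only the generator-side objective is shown; w_feat / w_adv are
    placeholder weights, not values from the paper."""
    logit_sr, feat_sr = disc(sr)
    with torch.no_grad():               # HR branch of the Siamese pair
        _, feat_hr = disc(hr)
    rec = F.l1_loss(sr, hr)                      # pixel reconstruction error
    feat = F.mse_loss(feat_sr, feat_hr)          # semantic feature distance
    adv = F.binary_cross_entropy_with_logits(    # fool the discriminator
        logit_sr, torch.ones_like(logit_sr))
    return rec + w_feat * feat + w_adv * adv

# Toy usage: random "SR output" and ground-truth HR patch.
disc = SiameseDiscriminator()
sr = torch.rand(2, 3, 64, 64, requires_grad=True)
hr = torch.rand(2, 3, 64, 64)
loss = generator_loss(sr, hr, disc)
loss.backward()
```

Because the same trained trunk serves both terms, the feature distance adapts during adversarial training instead of relying on a fixed pre-trained extractor, which is the coupling the abstract describes.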