No-reference Image Quality Assessment Based on Convolutional Neural Network

Yangming Chen, Xiuhua Jiang
DOI: 10.1109/ICCT.2018.8599897
Published in: 2018 IEEE 18th International Conference on Communication Technology (ICCT), October 2018

Abstract

Image Quality Assessment (IQA) is a classic research topic whose goal is to design algorithms that produce objective quality values consistent with the subjective judgments of the human visual system (HVS). IQA plays an important role in many image processing applications, such as image enhancement, image compression and reconstruction, and watermarking. In this paper, we present two no-reference image quality assessment (NR-IQA) models based on convolutional neural networks (CNNs). One of the biggest challenges in learning an NR-IQA model is the lack of images with subjective scores. We therefore label images with full-reference (FR) algorithms: some FR algorithms generate an intermediate "similarity map" while computing image quality, so we label images with either the similarity map or the objective score of an FR algorithm. The first model uses the objective scores of SSIM, VIF, GMSD, and FSIM, respectively, as labels to train an improved VGGNet. The second model uses the similarity maps generated by FSIM and VSI, respectively, as labels to train an improved U-Net. Experiments conducted on D215, a database built in our laboratory, show that our second model is comparable to state-of-the-art NR-IQA models.
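The labeling strategy above can be illustrated with SSIM, which yields both a scalar score and a per-pixel similarity map in one pass. The sketch below uses `skimage.metrics.structural_similarity` with `full=True`; the image pair is synthetic here purely for illustration (in the paper, pairs come from an IQA database such as the authors' D215), and this is only an analogue of the method, not the authors' exact pipeline.

```python
import numpy as np
from skimage.metrics import structural_similarity

# Hypothetical reference/distorted pair; in practice these would come
# from an IQA database, not synthetic noise.
rng = np.random.default_rng(0)
ref = rng.random((64, 64))
dist = np.clip(ref + rng.normal(0.0, 0.1, ref.shape), 0.0, 1.0)

# full=True returns both the scalar SSIM score and the per-pixel
# similarity map -- the intermediate product the paper exploits
# (there produced by FSIM/VSI) as a dense training label.
score, sim_map = structural_similarity(ref, dist, data_range=1.0, full=True)

print(score)          # scalar label for the score-regression (VGGNet) model
print(sim_map.shape)  # dense label for the map-regression (U-Net) model
```

A score-trained network regresses one number per image, while a map-trained network (the U-Net variant) predicts a full-resolution quality map, which preserves spatial information about where distortions occur.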