From a perceptual perspective: No-Reference Image Quality Assessment using Dual Perception Hybrid Network

Yang Lu, Zifan Yang, Zilu Zhou, Gaowei Zhang, Xiaoheng Jiang, Mingliang Xu

Pattern Recognition Letters, Volume 194, Pages 55–61. DOI: 10.1016/j.patrec.2025.04.035. Published 10 May 2025.
The goal of No-Reference Image Quality Assessment (NR-IQA) is to simulate human perception of image quality without access to a reference image. Previous research has largely focused on extracting content features from distorted images, perceiving distortion types pixel-by-pixel or block-by-block, while neglecting to establish a relationship between the quality of the distorted image and that of the reference image. To address this issue, this paper proposes a Dual Perception Hybrid Network (DPHN), where dual perception refers to the parallel extraction of quality and content features. Quality perception constructs a quality relationship by leveraging the difference between the features of the distorted image and those of a reconstructed image, while content perception focuses on learning the content information of the distortion itself from the distorted image. To demonstrate the effectiveness of the proposed DPHN, we evaluate it on four representative IQA datasets. Extensive experimental results show that the proposed network exhibits promising performance. Our code will be available at https://github.com/YZFzzu/DPHN.
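To make the dual-perception idea concrete, below is a minimal PyTorch sketch of the two parallel branches the abstract describes: a content branch that encodes the distorted image, a reconstruction branch that produces a pseudo-reference, and a quality branch built from the feature difference between the two. All module names, layer sizes, and the fusion scheme here are illustrative assumptions, not the authors' actual DPHN implementation (which is defined in the paper and the repository above).

# A minimal sketch of the dual-perception idea, assuming a Python/PyTorch
# implementation. Every module, size, and the concat-based fusion below is
# a hypothetical stand-in for the paper's real architecture.
import torch
import torch.nn as nn

class DualPerceptionSketch(nn.Module):
    def __init__(self, feat_dim=64):
        super().__init__()
        # Shared feature extractor (hypothetical backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
        )
        # Reconstruction branch: builds a pseudo-reference image
        # from the distorted input, since no true reference exists.
        self.decoder = nn.Sequential(
            nn.Conv2d(feat_dim, feat_dim, 3, padding=1), nn.ReLU(),
            nn.Conv2d(feat_dim, 3, 3, padding=1),
        )
        # Regression head mapping the fused features to a scalar score.
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(2 * feat_dim, 1),
        )

    def forward(self, distorted):
        content_feat = self.encoder(distorted)      # content perception
        reconstructed = self.decoder(content_feat)  # pseudo-reference image
        recon_feat = self.encoder(reconstructed)
        # Quality perception: difference between distorted and
        # reconstructed-image features stands in for a
        # distorted-versus-reference quality relationship.
        quality_feat = content_feat - recon_feat
        fused = torch.cat([content_feat, quality_feat], dim=1)
        return self.head(fused)

model = DualPerceptionSketch()
score = model(torch.randn(1, 3, 224, 224))  # one distorted image -> quality score of shape (1, 1)

The key point of this arrangement is that the quality branch never sees a true reference: the reconstructed image stands in for it, so the feature difference can approximate a distorted-versus-reference relationship even at test time, when only the distorted image is available.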
Journal introduction:
Pattern Recognition Letters aims at rapid publication of concise articles of broad interest in pattern recognition.
Subject areas include all the current fields of interest represented by the Technical Committees of the International Association of Pattern Recognition, and other developing themes involving learning and recognition.