{"title":"Closing the gap: Visual quality assessment considering viewing conditions","authors":"Yucheng Zhu, Guangtao Zhai, Ke Gu, Zhaohui Che","doi":"10.1109/QoMEX.2016.7498927","DOIUrl":null,"url":null,"abstract":"Most of existing visual quality assessment algorithms are tested on standard databases that are created in controlled viewing conditions (e.g. display device, viewing distance and lighting). This implies that all the recoded subjective scores are only valid for the specific settings used in the database. However, with the prevalence of mobile devices, the practical viewing environments can significantly vary from moment to moment. It is our daily experience that the same image can look drastically different on dissimilar devices under changed viewing distance and/or lighting conditions. In other words, a gap exists between the eyes and the visual contents behind the screen in current research of quality assessment. Therefore, in this work, we perform subjective quality evaluation with varied actual viewing conditions. To make the research reproducible, we build a prototype system to record what the eyes really see from the screen and construct the viewing environment-changed image database. The database will be made available to the public. Meanwhile we design a dedicated effective environment-assessing algorithm. We believe that this work will benefit the research of visual quality assessment towards more practical applications.","PeriodicalId":6645,"journal":{"name":"2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX)","volume":"122 1","pages":"1-6"},"PeriodicalIF":0.0000,"publicationDate":"2016-06-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 Eighth International Conference on Quality of Multimedia Experience (QoMEX)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/QoMEX.2016.7498927","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
Most existing visual quality assessment algorithms are tested on standard databases that are created under controlled viewing conditions (e.g., display device, viewing distance, and lighting). This implies that all the recorded subjective scores are only valid for the specific settings used in the database. However, with the prevalence of mobile devices, practical viewing environments can vary significantly from moment to moment. It is our daily experience that the same image can look drastically different on dissimilar devices, at different viewing distances, and/or under different lighting conditions. In other words, in current quality assessment research a gap exists between the eyes and the visual content behind the screen. Therefore, in this work, we perform subjective quality evaluation under varied actual viewing conditions. To make the research reproducible, we build a prototype system to record what the eyes really see on the screen and construct a viewing-environment-changed image database. The database will be made available to the public. Meanwhile, we design a dedicated and effective environment-assessing algorithm. We believe that this work will benefit research on visual quality assessment towards more practical applications.
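For intuition about why viewing distance alone can change perceived quality, the sketch below computes the effective angular resolution (pixels per degree) a display delivers to the eye. This is a standard optics calculation for illustration only, not the environment-assessing algorithm proposed in the paper; the display sizes, resolutions, and distances in the example are assumed values.

```python
import math

def pixels_per_degree(screen_width_cm: float,
                      horizontal_resolution_px: int,
                      viewing_distance_cm: float) -> float:
    """Angular pixel density of a display at a given viewing distance.

    Illustrative only: shows how viewing geometry changes what the eye
    receives; it is not the algorithm described in the paper.
    """
    # Horizontal field of view subtended by the screen, in degrees.
    fov_deg = 2.0 * math.degrees(
        math.atan(screen_width_cm / (2.0 * viewing_distance_cm)))
    return horizontal_resolution_px / fov_deg

# Example (assumed values): a 31 cm wide, 1920-px laptop screen viewed
# from 50 cm versus a 7 cm wide, 1080-px phone screen viewed from 30 cm.
print(pixels_per_degree(31.0, 1920, 50.0))  # ~56 px/deg
print(pixels_per_degree(7.0, 1080, 30.0))   # ~81 px/deg
```

The same image therefore reaches the retina at very different angular resolutions on different devices and distances, which is one reason subjective scores collected under fixed laboratory settings do not transfer directly to everyday viewing.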