NRQQA
Shengju Yu, Tiansong Li, Xiaoyu Xu, Hao Tao, Li Yu, Yixuan Wang
Proceedings of the ACM Multimedia Asia, December 15, 2019
DOI: 10.1145/3338533.3366563 (https://doi.org/10.1145/3338533.3366563)
Citations: 2
Abstract
Image stitching is widely used in immersive applications such as 3D modeling, VR, and AR, so the quality of the stitched result is crucial. Existing objective quality assessment methods for stitched images mostly assume that ground truth is available (i.e., they are full-reference), yet in most cases no ground truth exists. In this paper, a no-reference quality assessment metric designed specifically for stitched images is proposed. We first locate the parts of the stitched image that correspond to each source image. Then, the isolated points and the outer points generated by spherical projection are eliminated. After that, the bounding rectangle of the stitching seams is used to locate the overlapping regions in the stitched image. Finally, the assessment of the overlapping regions is taken as the final score. Extensive experiments show that our scores are consistent with human perception, and the proposed metric remains effective even for nuances that human eyes cannot distinguish.
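The abstract outlines a four-step pipeline: locate each source image inside the stitched result, discard isolated and outer points, take the bounding rectangle of the seam region as the overlap area, and score that area. The sketch below is only a rough illustration of this flow, not the authors' implementation: it assumes SIFT matching with a RANSAC inlier mask as a stand-in for the correspondence and outlier-elimination steps, and it uses Laplacian variance as a placeholder for the paper's overlap-region assessment. The function names (`locate_overlap_region`, `score_stitched_image`) are hypothetical.

```python
import cv2
import numpy as np


def locate_overlap_region(source_gray, stitched_gray,
                          ratio=0.75, ransac_thresh=5.0):
    """Find where a source image lands inside the stitched image and
    return the bounding rectangle (x, y, w, h) of the inlier matches."""
    # Detect and describe local features in both images.
    sift = cv2.SIFT_create()
    kp_src, des_src = sift.detectAndCompute(source_gray, None)
    kp_sti, des_sti = sift.detectAndCompute(stitched_gray, None)

    # Match descriptors and keep matches that pass Lowe's ratio test.
    matcher = cv2.BFMatcher()
    knn = matcher.knnMatch(des_src, des_sti, k=2)
    good = [m for m, n in knn if m.distance < ratio * n.distance]
    if len(good) < 4:
        raise RuntimeError("Not enough matches to locate the source image.")

    src_pts = np.float32([kp_src[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    sti_pts = np.float32([kp_sti[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)

    # RANSAC discards isolated / outlying correspondences; the inlier mask
    # stands in for the outlier-elimination step described in the abstract.
    _, mask = cv2.findHomography(src_pts, sti_pts, cv2.RANSAC, ransac_thresh)
    inliers = sti_pts[mask.ravel() == 1]

    # Bounding rectangle of the surviving points approximates the overlap area.
    return cv2.boundingRect(inliers.reshape(-1, 2).astype(np.float32))


def score_stitched_image(source_paths, stitched_path):
    """Average a placeholder no-reference score over the located overlap crops."""
    stitched = cv2.imread(stitched_path, cv2.IMREAD_GRAYSCALE)
    scores = []
    for path in source_paths:
        source = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
        x, y, w, h = locate_overlap_region(source, stitched)
        crop = stitched[y:y + h, x:x + w]
        # Placeholder quality measure: variance of the Laplacian (sharpness),
        # standing in for the paper's overlap-region assessment.
        scores.append(cv2.Laplacian(crop, cv2.CV_64F).var())
    return float(np.mean(scores))
```

Calling `score_stitched_image(["left.jpg", "right.jpg"], "pano.jpg")` returns a single scalar; with this placeholder, higher values only indicate sharper overlap crops and do not reproduce the calibrated scores reported in the paper.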