Title: Varying weighted spatial quality assessment for high resolution satellite image pan-sharpening
Authors: S. Mehravar, Farzaneh Dadras Javan, F. Samadzadegan, A. Toosi, Armin Moghimi, Reza Khatami, A. Stein
Journal: International Journal of Image and Data Fusion, 13(1), pp. 44-70
DOI: 10.1080/19479832.2021.1921059 (https://doi.org/10.1080/19479832.2021.1921059)
Published: 2021-05-03 (Journal Article)
Impact Factor: 1.8 (JCR Q3, Remote Sensing)
Citations: 2
Abstract
This paper focuses on the spatial quality assessment of pan-sharpened imagery, which contains valuable information from the input images. Its aim is to show that fusion functions respond differently to different types of landscapes. It compares an object-level quality assessment procedure with a conventional pixel-level procedure that assigns uniform quality scores to all pixels of a pan-sharpened image. To do so, after performing a series of pan-sharpening evaluations, a weighted procedure for the spatial quality assessment of pan-sharpening products is proposed, which allocates spatially varying weight factors to image pixels in proportion to their level of spatial information content. All experiments are performed on five high-resolution image datasets, with fusion products produced by three common pan-sharpening algorithms. The datasets are acquired from WorldView-2, QuickBird, and IKONOS. Experimental results show that the spatial distortion of fused images for the vegetation-cover class exceeds that of man-made structures, by more than 4% in some cases. Our procedure can preclude the illogical fidelity estimates that occur when pan-sharpened images contain different land covers. Since particular image structures are of high importance in remote sensing applications, our procedure provides a purpose-oriented estimate of the spatial quality of pan-sharpened images compared with conventional procedures.
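The core idea in the abstract — replacing a uniform per-pixel average with spatially varying weights proportional to local spatial information content — can be sketched as follows. This is a minimal illustrative sketch, not the paper's actual method: the gradient magnitude of the reference panchromatic band is used here as a hypothetical proxy for "spatial information content", and `weighted_spatial_score` is an invented name; the paper's weight definition and distortion metric may differ.

```python
import numpy as np

def weighted_spatial_score(distortion_map, reference_pan):
    """Aggregate a per-pixel spatial-distortion map two ways:
    (a) with weights proportional to local spatial information content
        (here approximated by gradient magnitude of the pan band), and
    (b) with the uniform mean used by conventional pixel-level procedures.
    """
    # Gradient magnitude of the reference pan image, used as an
    # (assumed) proxy for per-pixel spatial information content.
    gy, gx = np.gradient(reference_pan.astype(float))
    info = np.hypot(gx, gy)

    # Normalise so the spatially varying weights sum to 1; fall back to
    # uniform weights for a perfectly flat image.
    total = info.sum()
    if total > 0:
        weights = info / total
    else:
        weights = np.full_like(info, 1.0 / info.size)

    weighted = float((weights * distortion_map).sum())  # varying-weight score
    uniform = float(distortion_map.mean())              # conventional score
    return weighted, uniform
```

For example, if the distortion is concentrated on edge pixels (high information content), the weighted score exceeds the uniform mean, which is the kind of landscape-dependent behaviour the paper sets out to capture.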
Journal Introduction
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets for improved information extraction, as well as to increase the reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence based management.

The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:

• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)