Title: A context-driven pansharpening method using superpixel based texture analysis
Authors: H. Hallabia, H. Hamam, A. Ben Hamida
DOI: 10.1080/19479832.2020.1845244
Journal: International Journal of Image and Data Fusion, 12(1), pp. 1-22 (JCR Q3, Remote Sensing; IF 1.8)
Published: 2020-11-11 (Journal Article)
Citations: 3
Abstract
In this paper, we propose a context-driven injection scheme for pansharpening, in which the injection coefficients are computed over superpixel segments obtained with a modified Simple Linear Iterative Clustering (t-SLIC) technique applied to the texture descriptors of the PAN image. The t-SLIC algorithm generates homogeneous connected components according to their spectral properties. The proposed pansharpening method relies on a multiresolution framework, employing the Generalized Laplacian Pyramid (GLP) tailored to the Modulation Transfer Function (MTF) of the MS sensors to extract the high-frequency details. First, the injection gains are computed locally as regression coefficients between the upsampled MS and low-resolution PAN regions at a reduced scale. Then, they are multiplied by a global weighting factor, computed per spectral band and defined as the ratio of the variances of the expanded MS band and the PAN image. Finally, the spatial details are modulated by the estimated global-local injection coefficients at the superpixel level to produce the high-resolution MS image. Validation is carried out on two datasets acquired by the IKONOS and WorldView-3 satellites. The experimental results show that the proposed method performs favourably, both visually and quantitatively, compared with state-of-the-art pansharpening algorithms.
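The global-local injection scheme described in the abstract can be sketched in a few lines of numpy. This is an illustrative reconstruction under stated assumptions, not the authors' implementation: the superpixel label map `labels` stands in for the t-SLIC segmentation, and the MTF-tailored GLP low-pass PAN is taken as a precomputed input `pan_low` rather than derived from an actual pyramid.

```python
import numpy as np

def local_gains(ms_up, pan_low, labels):
    """Per-superpixel injection gain: the regression coefficient
    cov(MS, PAN_low) / var(PAN_low) computed inside each segment.

    ms_up   : upsampled MS band (2-D array)
    pan_low : low-resolution (low-pass) PAN image, same grid as ms_up
    labels  : integer superpixel map (stand-in for t-SLIC segments)
    """
    g = np.zeros(ms_up.shape, dtype=float)
    for lab in np.unique(labels):
        m = labels == lab
        p, s = pan_low[m], ms_up[m]
        var = p.var()
        # Fall back to unit gain on flat segments to avoid division by zero.
        g[m] = np.mean((p - p.mean()) * (s - s.mean())) / var if var > 0 else 1.0
    return g

def fuse_band(ms_up, pan, pan_low, labels):
    """Context-driven detail injection for one spectral band:
    MS_up + (global weight) * (local gain) * (high-frequency PAN details)."""
    w = ms_up.std() / pan.std()            # global variance-ratio weight, per band
    g = local_gains(ms_up, pan_low, labels)
    return ms_up + w * g * (pan - pan_low)  # inject modulated details
```

The per-segment regression adapts the injected energy to local context, while the global variance ratio keeps the radiometry of each fused band consistent with its MS counterpart.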
Journal description:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground-based imaging systems, and for integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets, for improved information extraction and increased reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence-based management. The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral or temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)