Bikash Meher, S. Agrawal, Rutuparna Panda, A. Abraham
{"title":"一种基于区域的各向异性扩散遥感图像融合方法","authors":"Bikash Meher, S. Agrawal, Rutuparna Panda, A. Abraham","doi":"10.1080/19479832.2021.2019132","DOIUrl":null,"url":null,"abstract":"ABSTRACT The aim of remote sensing image fusion is to merge the high spectral resolution multispectral (MS) image with high spatial resolution panchromatic (PAN) image to get a high spatial resolution MS image with less spectral distortion. The conventional pixel level fusion techniques suffer from the halo effect and gradient reversal. To solve this problem, a new region-based method using anisotropic diffusion (AD) for remote sensing image fusion is investigated. The basic idea is to fuse the ‘Y’ component only (of YCbCr colour space) of the MS image with the PAN image. The base layers and detail layers of the input images obtained using the AD process are segmented using the fuzzy c-means (FCM) algorithm and combined based on their spatial frequency. The fusion experiment uses three data sets. The contributions of this paper are as follows: i) it solves the chromaticity loss problem at the time of fusion, ii) the AD filter with the region-based fusion approach is brought into the context of remote sensing application for the first time, and iii) the edge info in the input images is retained. A qualitative and quantitative comparison is made with classic and recent state-of-the-art methods. The experimental results reveal that the proposed method produces promising fusion results.","PeriodicalId":46012,"journal":{"name":"International Journal of Image and Data Fusion","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2021-12-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A region based remote sensing image fusion using anisotropic diffusion process\",\"authors\":\"Bikash Meher, S. Agrawal, Rutuparna Panda, A. 
Abraham\",\"doi\":\"10.1080/19479832.2021.2019132\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"ABSTRACT The aim of remote sensing image fusion is to merge the high spectral resolution multispectral (MS) image with high spatial resolution panchromatic (PAN) image to get a high spatial resolution MS image with less spectral distortion. The conventional pixel level fusion techniques suffer from the halo effect and gradient reversal. To solve this problem, a new region-based method using anisotropic diffusion (AD) for remote sensing image fusion is investigated. The basic idea is to fuse the ‘Y’ component only (of YCbCr colour space) of the MS image with the PAN image. The base layers and detail layers of the input images obtained using the AD process are segmented using the fuzzy c-means (FCM) algorithm and combined based on their spatial frequency. The fusion experiment uses three data sets. The contributions of this paper are as follows: i) it solves the chromaticity loss problem at the time of fusion, ii) the AD filter with the region-based fusion approach is brought into the context of remote sensing application for the first time, and iii) the edge info in the input images is retained. A qualitative and quantitative comparison is made with classic and recent state-of-the-art methods. 
The experimental results reveal that the proposed method produces promising fusion results.\",\"PeriodicalId\":46012,\"journal\":{\"name\":\"International Journal of Image and Data Fusion\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.8000,\"publicationDate\":\"2021-12-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Image and Data Fusion\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1080/19479832.2021.2019132\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"REMOTE SENSING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Image and Data Fusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19479832.2021.2019132","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
A region based remote sensing image fusion using anisotropic diffusion process
ABSTRACT The aim of remote sensing image fusion is to merge a high spectral resolution multispectral (MS) image with a high spatial resolution panchromatic (PAN) image to obtain a high spatial resolution MS image with minimal spectral distortion. Conventional pixel-level fusion techniques suffer from the halo effect and gradient reversal. To address these problems, a new region-based method for remote sensing image fusion using anisotropic diffusion (AD) is investigated. The basic idea is to fuse only the ‘Y’ (luminance) component of the MS image, in YCbCr colour space, with the PAN image. The base and detail layers of the input images, obtained through the AD process, are segmented using the fuzzy c-means (FCM) algorithm and combined according to their spatial frequency. The fusion experiments use three data sets. The contributions of this paper are as follows: i) it solves the chromaticity loss problem during fusion, ii) the AD filter combined with a region-based fusion approach is brought into the remote sensing context for the first time, and iii) the edge information in the input images is retained. Qualitative and quantitative comparisons are made with classic and recent state-of-the-art methods. The experimental results reveal that the proposed method produces promising fusion results.
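The decompose-then-select pipeline described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: it uses classic Perona–Malik diffusion for the AD step, replaces the FCM region segmentation with a simple block-wise partition, and averages the two base layers; all function names, parameter values, and the base-layer combination rule are assumptions.

```python
import numpy as np

def anisotropic_diffusion(img, n_iter=10, kappa=0.1, gamma=0.2):
    """Perona-Malik diffusion (exponential conductance): smooths within
    regions while largely preserving edges. The result is the base layer."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # differences toward the four compass neighbours
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # conductance is small across strong gradients, so edges survive
        cn = np.exp(-(dn / kappa) ** 2)
        cs = np.exp(-(ds / kappa) ** 2)
        ce = np.exp(-(de / kappa) ** 2)
        cw = np.exp(-(dw / kappa) ** 2)
        u += gamma * (cn * dn + cs * ds + ce * de + cw * dw)
    return u

def spatial_frequency(block):
    """Activity measure combining row and column frequency of a block."""
    rf = np.sqrt(np.mean(np.diff(block, axis=1) ** 2))
    cf = np.sqrt(np.mean(np.diff(block, axis=0) ** 2))
    return np.sqrt(rf ** 2 + cf ** 2)

def fuse(y_ms, pan, block=8):
    """Split each input into base + detail via AD, then keep, per region,
    the detail layer with the higher spatial frequency."""
    base_ms = anisotropic_diffusion(y_ms)
    base_pan = anisotropic_diffusion(pan)
    det_ms, det_pan = y_ms - base_ms, pan - base_pan
    fused_base = 0.5 * (base_ms + base_pan)   # assumed averaging rule
    fused_det = np.zeros_like(det_ms)
    h, w = det_ms.shape
    for i in range(0, h, block):
        for j in range(0, w, block):
            a = det_ms[i:i + block, j:j + block]
            b = det_pan[i:i + block, j:j + block]
            fused_det[i:i + block, j:j + block] = (
                a if spatial_frequency(a) >= spatial_frequency(b) else b
            )
    return fused_base + fused_det
```

The choose-max rule on spatial frequency transfers the sharper detail (typically from the PAN image) into the fused result, while the shared base layer keeps the overall radiometry; in the paper the regions come from FCM clustering rather than a fixed grid.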
About the journal:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground based imaging systems, and integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets for improved information extraction, as well as to increase the reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification enabling evidence based management.

The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:

• Automatic registration/geometric aspects of fusing images with different spatial, spectral, temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)