{"title":"Multi-stage guided-filter for SAR and optical satellites images fusion using Curvelet and Gram Schmidt transforms for maritime surveillance","authors":"T. Ghoniemy, M. Hammad, A. Amein, T. Mahmoud","doi":"10.1080/19479832.2021.2003446","DOIUrl":null,"url":null,"abstract":"ABSTRACT Synthetic aperture radar (SAR) images depend on the dielectric properties of objects with certain incident angles. Thus, vessels and other metallic objects appear clear in SAR images however, they are difficult to be distinguished in optical images. Synergy of these two types of images leads to not only high spatial and spectral resolutions but also good explanation of the image scene. In this paper, a hybrid pixel-level image fusion method is proposed for integrating panchromatic (PAN), multispectral (MS) and SAR images. The fusion method is performed using Multi-stage guided filter (MGF) for optical images pansharpening, to get high preserving spatial details and nested Gram-Schmidt (GS) and Curvelet-Transform (CVT) methods for SAR and optical images,to increase the quality of the final fused image and benefit from the SAR image properties. The accuracy and performance of the proposed method are appraised using Landsat-8 Operational-Land-Imager (OLI) and Sentinel-1 images subjectively as well as objectively using different quality metrics. Moreover, the proposed method is compared to a number of state-of-the-art fusion techniques. The results show significant improvements in both visual quality and the spatial and spectral evaluation metrics. 
Consequently, the proposed method is capable of highlighting maritime activity for further processing.","PeriodicalId":46012,"journal":{"name":"International Journal of Image and Data Fusion","volume":null,"pages":null},"PeriodicalIF":1.8000,"publicationDate":"2021-11-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Image and Data Fusion","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1080/19479832.2021.2003446","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"REMOTE SENSING","Score":null,"Total":0}
Citations: 2
Abstract
Synthetic aperture radar (SAR) images depend on the dielectric properties of objects at certain incidence angles. Thus, vessels and other metallic objects appear clearly in SAR images; however, they are difficult to distinguish in optical images. Combining these two types of images yields not only high spatial and spectral resolution but also a better interpretation of the image scene. In this paper, a hybrid pixel-level image fusion method is proposed for integrating panchromatic (PAN), multispectral (MS) and SAR images. The fusion method uses a multi-stage guided filter (MGF) for optical-image pansharpening, to preserve spatial details, and nested Gram-Schmidt (GS) and Curvelet transform (CVT) methods for SAR and optical images, to increase the quality of the final fused image and exploit the properties of the SAR image. The accuracy and performance of the proposed method are appraised using Landsat-8 Operational Land Imager (OLI) and Sentinel-1 images, both subjectively and objectively using different quality metrics. Moreover, the proposed method is compared with a number of state-of-the-art fusion techniques. The results show significant improvements in both visual quality and the spatial and spectral evaluation metrics. Consequently, the proposed method is capable of highlighting maritime activity for further processing.
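The Gram-Schmidt stage referred to above is a component-substitution pansharpening technique: a low-resolution panchromatic band is simulated from the MS bands, and the detail of the real PAN band is injected into each MS band with a per-band gain. The sketch below is an illustration of that general GS-style detail-injection idea only, not the authors' implementation; the choice of the MS band mean as the simulated PAN is a simplifying assumption, and the function name is hypothetical.

```python
import numpy as np

def gram_schmidt_pansharpen(ms, pan):
    """GS-style detail-injection pansharpening sketch (illustrative only).

    ms:  (bands, H, W) multispectral cube, already upsampled to the PAN grid.
    pan: (H, W) panchromatic band.
    """
    bands, h, w = ms.shape
    # Simulated low-resolution PAN: here simply the mean of the MS bands
    # (an assumed simplification; real GS uses sensor-derived weights).
    sim_pan = ms.mean(axis=0)
    # Spatial detail to inject: difference between real and simulated PAN.
    detail = pan - sim_pan
    out = np.empty_like(ms)
    flat_sim = sim_pan.ravel()
    for b in range(bands):
        # Per-band injection gain: covariance of the band with the simulated
        # PAN, normalised by the simulated PAN's variance (regression slope).
        g = np.cov(ms[b].ravel(), flat_sim)[0, 1] / (np.var(flat_sim) + 1e-12)
        out[b] = ms[b] + g * detail
    return out
```

When the real PAN equals the simulated PAN, the injected detail is zero and the MS bands pass through unchanged, which is a quick sanity check on the substitution logic.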
Journal overview:
International Journal of Image and Data Fusion provides a single source of information for all aspects of image and data fusion methodologies, developments, techniques and applications. Image and data fusion techniques are important for combining the many sources of satellite, airborne and ground-based imaging systems, and for integrating these with other related data sets for enhanced information extraction and decision making. Image and data fusion aims at the integration of multi-sensor, multi-temporal, multi-resolution and multi-platform image data, together with geospatial data, GIS, in-situ, and other statistical data sets, for improved information extraction as well as increased reliability of the information. This leads to more accurate information that provides for robust operational performance, i.e. increased confidence, reduced ambiguity and improved classification, enabling evidence-based management. The journal welcomes original research papers, review papers, shorter letters, technical articles, book reviews and conference reports in all areas of image and data fusion including, but not limited to, the following aspects and topics:
• Automatic registration/geometric aspects of fusing images with different spatial, spectral or temporal resolutions; phase information; or acquired in different modes
• Pixel, feature and decision level fusion algorithms and methodologies
• Data assimilation: fusing data with models
• Multi-source classification and information extraction
• Integration of satellite, airborne and terrestrial sensor systems
• Fusing temporal data sets for change detection studies (e.g. for Land Cover/Land Use Change studies)
• Image and data mining from multi-platform, multi-source, multi-scale, multi-temporal data sets (e.g. geometric information, topological information, statistical information, etc.)