Title: Toward the optimal spatial resolution ratio for fusion of UAV and Sentinel-2 satellite imageries using metaheuristic optimization
Authors: Ahmad Toosi, Farhad Samadzadegan, Farzaneh Dadrass Javan
Journal: Advances in Space Research, Vol. 75, Issue 7, pp. 5254-5282 (published 2025-02-15)
DOI: 10.1016/j.asr.2025.02.019
URL: https://www.sciencedirect.com/science/article/pii/S0273117725001383
Abstract
Sentinel-2A/2B twin satellites provide multispectral imagery every 5 days at a medium spatial resolution (10 m for visible and near-infrared bands). In contrast, UAV photogrammetry with low-cost visible-light (RGB) sensors produces ultra-high-resolution orthomosaics but lacks rich spectral information. Fusing these datasets is a solution to enhance the resolution of Sentinel imagery using UAV data. However, fusion is challenging due to large spatial resolution disparities, particularly in Ground Sampling Distance (GSD). Challenging the common practice of sharpening the Sentinel image to match the UAV resolution, we propose a method that fuses the two images at an intermediate resolution level where their information content is comparable. Our approach uses natural target edge analysis and a Genetic Algorithm-based metaheuristic optimization technique. By minimizing an objective function comprising the true GSD, or Ground Resolved Distance (GRD), and Mutual Information (MI), we determine the optimal resolution level for fusion. Experimental validation on two UAV and Sentinel datasets yielded optimal GRD estimates of 2.35 m and 2.03 m, respectively, with Sentinel GRD values of 12.12 m and 12.39 m. The optimal UAV-Sentinel GRD ratios were 0.193 and 0.164. The sharpened Sentinel images showed efficient fusion in both subjective and objective quality assessments. Testing the method on 24 UAV-Sentinel datasets from various regions, including America (United States), Europe (Germany, Spain, and Switzerland), and Asia (Iran and Qatar), demonstrated its robustness across different land covers and sensor types. This approach can be applied to any multi-sensor remote sensing image fusion task with significant resolution differences, establishing a meaningful level for fusion.
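The optimization described in the abstract can be illustrated with a small sketch. The following is a minimal, self-contained toy and not the authors' implementation: it runs a plain genetic algorithm over candidate intermediate GRD values, scoring each candidate by a weighted combination of a normalized resolution term and the histogram-based mutual information between a UAV image downsampled to that GRD and a Sentinel image upsampled to the same grid. The synthetic data, weights, resampling choices, and all function names are illustrative assumptions.

```python
# Toy sketch (not the paper's code): genetic search for an intermediate
# fusion resolution, minimizing an objective built from the candidate GRD
# and the mutual information (MI) between the two resampled images.
import numpy as np

rng = np.random.default_rng(0)

def block_mean(img, factor):
    """Downsample by averaging non-overlapping factor x factor blocks."""
    h, w = (img.shape[0] // factor) * factor, (img.shape[1] // factor) * factor
    img = img[:h, :w]
    return img.reshape(h // factor, factor, w // factor, factor).mean(axis=(1, 3))

def mutual_information(a, b, bins=32):
    """Histogram-based MI between two equally shaped images."""
    hist, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
    pxy = hist / hist.sum()
    px, py = pxy.sum(axis=1, keepdims=True), pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0
    return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

# Synthetic stand-ins: a fine "UAV" band (0.1 m GSD) and a coarse "Sentinel"
# band (10 m GSD) derived from the same scene so that MI is meaningful.
scene = rng.random((1000, 1000))
uav = scene                                # pretend GRD ~ 0.1 m
sentinel = block_mean(scene, 100)          # pretend GRD ~ 10 m
UAV_GRD, SENTINEL_GRD = 0.1, 10.0

def objective(grd, w_res=0.5, w_mi=0.5):
    """Lower is better: small candidate GRD, high MI at that resolution."""
    factor = max(1, int(round(grd / UAV_GRD)))            # UAV -> candidate GRD
    uav_at_grd = block_mean(uav, factor)
    rep = max(1, int(round(SENTINEL_GRD / grd)))          # Sentinel -> candidate GRD
    sen_at_grd = np.kron(sentinel, np.ones((rep, rep)))   # nearest-neighbour upsample
    side = min(uav_at_grd.shape[0], sen_at_grd.shape[0],
               uav_at_grd.shape[1], sen_at_grd.shape[1])
    mi = mutual_information(uav_at_grd[:side, :side], sen_at_grd[:side, :side])
    return w_res * (grd / SENTINEL_GRD) - w_mi * mi

# Plain genetic algorithm over candidate GRDs between the two native resolutions.
pop = rng.uniform(UAV_GRD * 2, SENTINEL_GRD, size=20)
for _ in range(15):
    fitness = np.array([objective(g) for g in pop])
    parents = pop[np.argsort(fitness)][:10]                 # truncation selection
    children = 0.5 * (rng.permutation(parents) + parents)   # blend crossover
    children += rng.normal(0.0, 0.2, size=children.shape)   # Gaussian mutation
    pop = np.clip(np.concatenate([parents, children]),
                  UAV_GRD * 2, SENTINEL_GRD)

best = pop[np.argmin([objective(g) for g in pop])]
print(f"intermediate GRD ~ {best:.2f} m, ratio to Sentinel ~ {best / SENTINEL_GRD:.3f}")
```

The paper's actual objective combines an edge-analysis-based estimate of the true GRD with MI; the weights, resampling scheme, and GA operators above are placeholders chosen only to make the search loop concrete.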
Journal description:
The COSPAR publication Advances in Space Research (ASR) is an open journal covering all areas of space research including: space studies of the Earth's surface, meteorology, climate, the Earth-Moon system, planets and small bodies of the solar system, upper atmospheres, ionospheres and magnetospheres of the Earth and planets including reference atmospheres, space plasmas in the solar system, astrophysics from space, materials sciences in space, fundamental physics in space, space debris, space weather, Earth observations of space phenomena, etc.
NB: Please note that manuscripts related to life sciences as related to space are no longer accepted for submission to Advances in Space Research. Such manuscripts should now be submitted to the new COSPAR journal Life Sciences in Space Research (LSSR).
All submissions are reviewed by two scientists in the field. COSPAR is an interdisciplinary scientific organization concerned with the progress of space research on an international scale. Operating under the rules of ICSU, COSPAR ignores political considerations and considers all questions solely from the scientific viewpoint.