{"title":"Application of SAR-Optical fusion to extract shoreline position from Cloud-Contaminated satellite images","authors":"Yongjing Mao, Kristen D. Splinter","doi":"10.1016/j.isprsjprs.2025.01.013","DOIUrl":null,"url":null,"abstract":"<div><div>Shorelines derived from optical satellite images are increasingly being used for regional to global scale analysis of sandy coastline dynamics. The optical satellite record, however, is contaminated by cloud cover, which can substantially reduce the temporal resolution of available images for shoreline analysis. Meanwhile, with the development of deep learning methods, optical images are increasingly fused with Synthetic Aperture Radar (SAR) images that are unaffected by clouds to reconstruct the cloud-contaminated pixels. Such SAR-Optical fusion methods have been shown successful for different land surface applications, but the unique characteristics of coastal areas make the applicability of this method unknown in these dynamic zones.</div><div>Herein we apply a deep internal learning (DIL) method to reconstruct cloud-contaminated optical images and explore its applicability to retrieve shorelines obscured by clouds. Our approach uses a mixed sequence of SAR and Gaussian noise images as the prior and the cloudy Modified Normalized Difference Water Index (MNDWI) as the target. The DIL encodes the target with priors and synthesizes plausible pixels under cloud cover. A unique aspect of our workflow is the inclusion of Gaussian noise in the prior sequence for MNDWI images when SAR images collected within a 1-day temporal lag are not available. A novel loss function of DIL model is also introduced to optimize the image reconstruction near the shoreline. These new developments have significant contribution to the model accuracy.</div><div>The DIL method is tested at four different sites with varying tide, wave, and shoreline dynamics. Shorelines derived from the reconstructed and true MNDWI images are compared to quantify the internal accuracy of shoreline reconstruction. For microtidal environments with mean springs tidal range less than 2 m, the mean absolute error (MAE) of shoreline reconstruction is less than 7.5 m with the coefficient of determination (<span><math><mrow><msup><mrow><mi>R</mi></mrow><mn>2</mn></msup></mrow></math></span>) more than 0.78 regardless of shoreline and wave dynamics. The method is less skilful in macro- and mesotidal environments due to the larger water level difference in the paired optical and SAR images, resulting in the MAE of 12.59 m and <span><math><mrow><msup><mrow><mi>R</mi></mrow><mn>2</mn></msup></mrow></math></span> of 0.43. The proposed SAR-Optical fusion method demonstrates substantially better accuracy in retrieving cloud-obscured shoreline positions compared to interpolation methods relying solely on optical images. 
Results from our work highlight the great potential of SAR-Optical fusion to derive shorelines even under the cloudiest conditions, thus increasing the temporal resolution of shoreline datasets.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"220 ","pages":"Pages 563-579"},"PeriodicalIF":10.6000,"publicationDate":"2025-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625000139","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0
Abstract
Shorelines derived from optical satellite images are increasingly used for regional- to global-scale analysis of sandy coastline dynamics. The optical satellite record, however, is contaminated by cloud cover, which can substantially reduce the temporal resolution of images available for shoreline analysis. Meanwhile, with the development of deep learning methods, optical images are increasingly fused with Synthetic Aperture Radar (SAR) images, which are unaffected by clouds, to reconstruct the cloud-contaminated pixels. Such SAR-Optical fusion methods have proven successful for a range of land surface applications, but the unique characteristics of coastal areas mean their applicability in these dynamic zones remains unknown.
Herein we apply a deep internal learning (DIL) method to reconstruct cloud-contaminated optical images and explore its applicability for retrieving shorelines obscured by clouds. Our approach uses a mixed sequence of SAR and Gaussian noise images as the prior and the cloudy Modified Normalized Difference Water Index (MNDWI) as the target. The DIL encodes the target with the priors and synthesizes plausible pixels under cloud cover. A unique aspect of our workflow is the inclusion of Gaussian noise in the prior sequence for MNDWI images when no SAR image collected within a 1-day temporal lag is available. A novel loss function for the DIL model is also introduced to optimize image reconstruction near the shoreline. These new developments contribute significantly to the model accuracy.
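The workflow above names two concrete ingredients: the MNDWI target and a prior sequence that mixes SAR images with Gaussian noise whenever no SAR acquisition falls within the 1-day lag. The sketch below illustrates those two pieces only; the MNDWI formula follows its standard definition (Green − SWIR) / (Green + SWIR), while the function names, the unit-variance noise, and the date-matching logic are illustrative assumptions rather than the authors' exact implementation.

```python
import numpy as np

def mndwi(green, swir):
    """Modified Normalized Difference Water Index:
    MNDWI = (Green - SWIR) / (Green + SWIR)."""
    green = green.astype(np.float32)
    swir = swir.astype(np.float32)
    return (green - swir) / (green + swir + 1e-8)  # small epsilon avoids division by zero

def build_prior_sequence(sar_stack, optical_dates, sar_dates,
                         max_lag_days=1, noise_std=1.0, seed=0):
    """Assemble a prior sequence for the DIL model: pair each cloudy MNDWI
    date with the SAR image acquired within `max_lag_days`; where no such
    SAR image exists, substitute a Gaussian noise image of the same shape."""
    rng = np.random.default_rng(seed)
    priors = []
    for t in optical_dates:
        lag_days = np.abs((sar_dates - t) / np.timedelta64(1, "D"))
        if lag_days.min() <= max_lag_days:
            priors.append(sar_stack[int(lag_days.argmin())])
        else:
            priors.append(rng.normal(0.0, noise_std, size=sar_stack.shape[1:]))
    return np.stack(priors)
```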
The DIL method is tested at four different sites with varying tide, wave, and shoreline dynamics. Shorelines derived from the reconstructed and true MNDWI images are compared to quantify the internal accuracy of shoreline reconstruction. For microtidal environments with a mean spring tidal range of less than 2 m, the mean absolute error (MAE) of shoreline reconstruction is less than 7.5 m with a coefficient of determination (R²) greater than 0.78, regardless of shoreline and wave dynamics. The method is less skilful in macro- and mesotidal environments due to the larger water level difference between the paired optical and SAR images, resulting in an MAE of 12.59 m and an R² of 0.43. The proposed SAR-Optical fusion method demonstrates substantially better accuracy in retrieving cloud-obscured shoreline positions than interpolation methods relying solely on optical images. Results from our work highlight the great potential of SAR-Optical fusion to derive shorelines even under the cloudiest conditions, thus increasing the temporal resolution of shoreline datasets.
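To make the reported MAE and R² concrete, the following is a minimal sketch of one way shorelines could be extracted from reconstructed and true MNDWI images and compared on common cross-shore transects. The zero-MNDWI threshold, the transect-based sampling, and the 10 m pixel size (typical of Sentinel-2 bands) are assumptions for illustration; the abstract does not specify the paper's exact extraction and comparison procedure.

```python
import numpy as np
from skimage import measure  # scikit-image

def extract_shoreline(mndwi_img, threshold=0.0):
    """Sub-pixel land/water contours from an MNDWI image; a zero crossing
    is a common (assumed) choice of shoreline threshold."""
    return measure.find_contours(mndwi_img, level=threshold)

def shoreline_mae_r2(pred_positions, true_positions, pixel_size=10.0):
    """MAE (in metres) and R² between shoreline positions sampled on the
    same transects; positions are in pixels and scaled by the assumed
    ground sampling distance."""
    pred = np.asarray(pred_positions, dtype=float) * pixel_size
    true = np.asarray(true_positions, dtype=float) * pixel_size
    mae = float(np.mean(np.abs(pred - true)))
    ss_res = float(np.sum((true - pred) ** 2))
    ss_tot = float(np.sum((true - true.mean()) ** 2))
    return mae, 1.0 - ss_res / ss_tot
```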
Journal Description
The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive.
P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields.
In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.