DCDGAN-STF: A Multiscale Deformable Convolution Distillation GAN for Remote Sensing Image Spatiotemporal Fusion

Authors: Yan Zhang; Rongbo Fan; PeiPei Duan; Jinfang Dong; Zhiyong Lei
Journal: IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing, vol. 17, pp. 19436-19450
DOI: 10.1109/JSTARS.2024.3476153
Published: 2024-10-07 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10707182/
Citations: 0
Abstract
Remote sensing image spatiotemporal fusion (STF) aims to generate composite images with high temporal and spatial resolutions by combining remote sensing images captured at different times and with different spatial resolutions (DTDS). Among existing fusion algorithms, deep learning-based fusion models have demonstrated outstanding performance. These models treat STF as an image super-resolution problem based on multiple reference images. However, compared with traditional image super-resolution tasks, remote sensing image STF involves merging a larger amount of multitemporal data with greater resolution differences. To enhance the robust matching of spatiotemporal transformations between multiple sets of remote sensing images captured at DTDS and to generate super-resolution composite images, we propose a feature fusion network called the multiscale deformable convolution distillation generative adversarial network (DCDGAN-STF). Specifically, to address the differences in multitemporal data, we introduce a pyramid cascading deformable encoder to identify disparities across multitemporal images. In addition, to address the differences in spatial resolution, we propose a teacher–student correlation distillation method. This method uses the texture-detail disparities between high-resolution multitemporal images to guide the extraction of disparities in blurred low-resolution multitemporal images. We comprehensively compared the proposed DCDGAN-STF with several state-of-the-art algorithms on two Landsat and Moderate Resolution Imaging Spectroradiometer (MODIS) datasets. Ablation experiments were also conducted to test the effectiveness of the different submodules within DCDGAN-STF. The experimental results and ablation analysis demonstrate that our algorithm achieves superior performance compared with other algorithms.
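The teacher–student correlation distillation idea described above can be sketched in a minimal form: the inter-temporal correlation structure of high-resolution (teacher) features supervises that of low-resolution (student) features. All function names, shapes, and the cosine-similarity formulation below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def correlation_matrix(feats):
    """Pairwise cosine-similarity matrix between flattened feature maps.

    feats: array of shape (T, C, H, W), features from T temporal frames.
    Returns a (T, T) correlation matrix.
    """
    t = feats.shape[0]
    flat = feats.reshape(t, -1)
    flat = flat / (np.linalg.norm(flat, axis=1, keepdims=True) + 1e-8)
    return flat @ flat.T

def correlation_distillation_loss(teacher_feats, student_feats):
    """MSE between the teacher and student temporal-correlation matrices.

    Hypothetical stand-in for correlation distillation: the HR (teacher)
    branch's inter-temporal structure guides the blurred LR (student) branch.
    """
    ct = correlation_matrix(teacher_feats)
    cs = correlation_matrix(student_feats)
    return float(np.mean((ct - cs) ** 2))

# Toy example: 3 temporal frames of 4-channel 8x8 feature maps.
rng = np.random.default_rng(0)
teacher = rng.standard_normal((3, 4, 8, 8))
student = rng.standard_normal((3, 4, 8, 8))
loss_same = correlation_distillation_loss(teacher, teacher)  # zero by construction
loss_diff = correlation_distillation_loss(teacher, student)
print(loss_same, loss_diff)
```

Because only the (T, T) correlation structure is matched rather than raw feature values, the loss stays well defined even when teacher and student features have different spatial resolutions, which is the setting the paper targets.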
About the Journal:
The IEEE Journal of Selected Topics in Applied Earth Observations and Remote Sensing addresses the growing field of applications in Earth observations and remote sensing, and also provides a venue for the rapidly expanding special issues sponsored by the IEEE Geoscience and Remote Sensing Society. The journal draws upon the experience of the highly successful IEEE Transactions on Geoscience and Remote Sensing and provides a complementary medium for the wide range of topics in applied Earth observations. The "Applications" areas encompass the societal benefit areas of the Global Earth Observation System of Systems (GEOSS) program. Through deliberations over two years, ministers from 50 countries agreed to identify nine areas where Earth observation could positively impact the quality of life and health of their respective countries. Some of these are areas not traditionally addressed in the IEEE context, including biodiversity, health, and climate. Yet it is the skill sets of IEEE members, in areas such as observations, communications, computers, signal processing, standards, and ocean engineering, that form the technical underpinnings of GEOSS. Thus, the journal attracts a broad range of interests that serves present members in new ways and expands IEEE visibility into new areas.