Seg-CycleGAN: SAR-to-Optical Image Translation Guided by a Downstream Task

Hannuo Zhang; Huihui Li; Jiarui Lin; Yujie Zhang; Jianghua Fan; Hang Liu; Kun Liu
{"title":"Seg-CycleGAN:由下游任务引导的合成孔径雷达图像到光学图像的转换","authors":"Hannuo Zhang;Huihui Li;Jiarui Lin;Yujie Zhang;Jianghua Fan;Hang Liu;Kun Liu","doi":"10.1109/LGRS.2025.3538868","DOIUrl":null,"url":null,"abstract":"Optical remote sensing and synthetic aperture radar (SAR) remote sensing are crucial for earth observation, offering complementary capabilities. While optical sensors provide high-quality images, they are limited by weather and lighting conditions. In contrast, SAR sensors can operate effectively under adverse conditions. This letter proposes a generative adversarial network (GAN)-based SAR-to-optical image translation method named Seg-CycleGAN, designed to enhance the accuracy of ship target translation by leveraging semantic information from a pretrained semantic segmentation model. Our method utilizes the downstream task of ship target semantic segmentation to guide the training of the image translation network, improving the quality of output optical-styled images. The potential of foundation-model-annotated datasets in SAR-to-optical translation tasks is revealed. This work suggests broader research and applications for downstream-task-guided frameworks. The code and link to download the proposed HRSID-DIOR dataset will be available at <uri>https://github.com/NPULHH/</uri>.","PeriodicalId":91017,"journal":{"name":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","volume":"22 ","pages":"1-5"},"PeriodicalIF":0.0000,"publicationDate":"2025-02-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Seg-CycleGAN: SAR-to-Optical Image Translation Guided by a Downstream Task\",\"authors\":\"Hannuo Zhang;Huihui Li;Jiarui Lin;Yujie Zhang;Jianghua Fan;Hang Liu;Kun Liu\",\"doi\":\"10.1109/LGRS.2025.3538868\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Optical remote sensing and synthetic aperture radar (SAR) remote sensing are crucial for earth observation, offering complementary capabilities. While optical sensors provide high-quality images, they are limited by weather and lighting conditions. In contrast, SAR sensors can operate effectively under adverse conditions. This letter proposes a generative adversarial network (GAN)-based SAR-to-optical image translation method named Seg-CycleGAN, designed to enhance the accuracy of ship target translation by leveraging semantic information from a pretrained semantic segmentation model. Our method utilizes the downstream task of ship target semantic segmentation to guide the training of the image translation network, improving the quality of output optical-styled images. The potential of foundation-model-annotated datasets in SAR-to-optical translation tasks is revealed. This work suggests broader research and applications for downstream-task-guided frameworks. 
The code and link to download the proposed HRSID-DIOR dataset will be available at <uri>https://github.com/NPULHH/</uri>.\",\"PeriodicalId\":91017,\"journal\":{\"name\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"volume\":\"22 \",\"pages\":\"1-5\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-02-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10872937/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE geoscience and remote sensing letters : a publication of the IEEE Geoscience and Remote Sensing Society","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10872937/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Optical remote sensing and synthetic aperture radar (SAR) remote sensing are crucial for earth observation, offering complementary capabilities. While optical sensors provide high-quality images, they are limited by weather and lighting conditions. In contrast, SAR sensors can operate effectively under adverse conditions. This letter proposes a generative adversarial network (GAN)-based SAR-to-optical image translation method named Seg-CycleGAN, designed to enhance the accuracy of ship target translation by leveraging semantic information from a pretrained semantic segmentation model. Our method utilizes the downstream task of ship target semantic segmentation to guide the training of the image translation network, improving the quality of output optical-styled images. The potential of foundation-model-annotated datasets in SAR-to-optical translation tasks is revealed. This work suggests broader research and applications for downstream-task-guided frameworks. The code and link to download the proposed HRSID-DIOR dataset will be available at https://github.com/NPULHH/.
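
As a rough illustration of the downstream-task guidance described in the abstract, the sketch below shows how a frozen, pretrained ship-segmentation network could supply an extra loss term for the SAR-to-optical generator. All names here (TinyGenerator, TinySegmenter, segmentation_guidance_loss, lambda_seg) are hypothetical stand-ins, not the authors' code; the actual Seg-CycleGAN networks, losses, and loss weighting are defined in the paper and the linked repository.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F


class TinyGenerator(nn.Module):
    """Hypothetical stand-in for the SAR-to-optical generator G
    (1-channel SAR input, 3-channel optical-styled output)."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 3, 3, padding=1), nn.Tanh(),
        )

    def forward(self, x):
        return self.net(x)


class TinySegmenter(nn.Module):
    """Hypothetical stand-in for a pretrained ship-segmentation model,
    kept frozen while the translation network trains."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(),
            nn.Conv2d(16, 1, 3, padding=1),  # 1-channel ship-mask logits
        )

    def forward(self, x):
        return self.net(x)


def segmentation_guidance_loss(generator, segmenter, sar, ship_mask, lambda_seg=1.0):
    """Translate a SAR batch and score the result with the frozen segmenter.

    The segmenter's prediction on the translated image is compared against a
    reference ship mask, so gradients push the generator to preserve ship
    targets. In a full CycleGAN setup this term would be added to the usual
    adversarial and cycle-consistency losses.
    """
    fake_optical = generator(sar)
    seg_logits = segmenter(fake_optical)  # gradients flow through the frozen segmenter back to G
    seg_loss = F.binary_cross_entropy_with_logits(seg_logits, ship_mask)
    return lambda_seg * seg_loss, fake_optical


if __name__ == "__main__":
    G = TinyGenerator()
    S = TinySegmenter()
    for p in S.parameters():          # freeze the pretrained segmenter
        p.requires_grad_(False)

    sar = torch.randn(2, 1, 64, 64)                    # toy SAR batch
    mask = (torch.rand(2, 1, 64, 64) > 0.9).float()    # toy ship masks
    loss_seg, fake_optical = segmentation_guidance_loss(G, S, sar, mask)
    loss_seg.backward()               # only the generator receives gradients
    print(fake_optical.shape, float(loss_seg))
```

In a complete training loop, this segmentation term would simply be summed with the adversarial and cycle-consistency losses before each generator update; the relative weight (here lambda_seg) is an assumption and would need tuning.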