GAN-Based Day and Night Image Cross-Domain Conversion Research and Application

Bo-quan Yu, Hanting Wei, Wei Wang
{"title":"基于gan的昼夜图像跨域转换研究与应用","authors":"Bo-quan Yu, Hanting Wei, Wei Wang","doi":"10.1109/ICTech55460.2022.00053","DOIUrl":null,"url":null,"abstract":"With the development and application of deep learning in computer vision, the performance of many basic visual tasks such as object detection and semantic segmentation has been greatly improved. However, most of networks are based on standard illumination, which results in poor performance in low illumination scenarios, and it is difficult to collect datasets with different illumination levels in restricted scenes. In this paper, GAN and related derived networks are systematically studied and summarized, and based on the idea of generation-antagonism of GAN, the design of day-night cross-domain converter is completed on the basis of the structure of CycleGAN. Based on this, Inception layer is added to optimize the structure of the converter, and the performance of the day-night cross-domain converters before and after optimization are compared through experiments. The results show that the optimized day-night converter can make the converted image more realistic. It is of great significance for enhancing the quality of datasets in restricted scenes, improving the performance of object detection and segmentation models in low illumination scenes.","PeriodicalId":290836,"journal":{"name":"2022 11th International Conference of Information and Communication Technology (ICTech))","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-02-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"GAN-Based Day and Night Image Cross-Domain Conversion Research and Application\",\"authors\":\"Bo-quan Yu, Hanting Wei, Wei Wang\",\"doi\":\"10.1109/ICTech55460.2022.00053\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"With the development and application of deep learning in computer vision, the performance of many basic visual tasks such as object detection and semantic segmentation has been greatly improved. However, most of networks are based on standard illumination, which results in poor performance in low illumination scenarios, and it is difficult to collect datasets with different illumination levels in restricted scenes. In this paper, GAN and related derived networks are systematically studied and summarized, and based on the idea of generation-antagonism of GAN, the design of day-night cross-domain converter is completed on the basis of the structure of CycleGAN. Based on this, Inception layer is added to optimize the structure of the converter, and the performance of the day-night cross-domain converters before and after optimization are compared through experiments. The results show that the optimized day-night converter can make the converted image more realistic. 
It is of great significance for enhancing the quality of datasets in restricted scenes, improving the performance of object detection and segmentation models in low illumination scenes.\",\"PeriodicalId\":290836,\"journal\":{\"name\":\"2022 11th International Conference of Information and Communication Technology (ICTech))\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-02-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 11th International Conference of Information and Communication Technology (ICTech))\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICTech55460.2022.00053\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 11th International Conference of Information and Communication Technology (ICTech))","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICTech55460.2022.00053","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

With the development and application of deep learning in computer vision, the performance of many basic visual tasks such as object detection and semantic segmentation has improved greatly. However, most networks are trained under standard illumination, which leads to poor performance in low-illumination scenarios, and it is difficult to collect datasets spanning different illumination levels in restricted scenes. In this paper, GAN and its derived networks are systematically studied and summarized, and, based on the generative-adversarial idea of GAN, a day-night cross-domain converter is designed on top of the CycleGAN architecture. On this basis, an Inception layer is added to optimize the converter structure, and the performance of the day-night cross-domain converter before and after optimization is compared experimentally. The results show that the optimized day-night converter produces more realistic converted images, which is of significance for enhancing the quality of datasets in restricted scenes and improving the performance of object detection and segmentation models in low-illumination scenes.
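
For reference, the converter builds on the standard CycleGAN objective, which pairs adversarial losses for both mapping directions with a cycle-consistency term; the notation below follows the original CycleGAN formulation, and the weight $\lambda$ used by the authors is not reported in this abstract:

$$\mathcal{L}(G, F, D_X, D_Y) = \mathcal{L}_{\mathrm{GAN}}(G, D_Y, X, Y) + \mathcal{L}_{\mathrm{GAN}}(F, D_X, Y, X) + \lambda\, \mathcal{L}_{\mathrm{cyc}}(G, F),$$

$$\mathcal{L}_{\mathrm{cyc}}(G, F) = \mathbb{E}_{x \sim p_{\mathrm{data}}(x)}\big[\lVert F(G(x)) - x \rVert_1\big] + \mathbb{E}_{y \sim p_{\mathrm{data}}(y)}\big[\lVert G(F(y)) - y \rVert_1\big],$$

where $G$ maps the day domain $X$ to the night domain $Y$, $F$ maps back, and $D_X$, $D_Y$ are the corresponding discriminators. The cycle-consistency term is what allows the converter to be trained on unpaired day and night images.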
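The abstract does not detail where the Inception layer sits inside the generator, so the following is only a minimal PyTorch sketch of one plausible reading: an Inception-style block with parallel 1x1 / 3x3 / 5x5 convolution branches, wrapped in a residual connection so it can stand in for a residual block in the ResNet-style body of a CycleGAN generator. All names, branch widths, and the use of InstanceNorm are assumptions, not the authors' published design.

```python
import torch
import torch.nn as nn


class InceptionBlock(nn.Module):
    """Hypothetical Inception-style block for a CycleGAN generator body.

    Parallel 1x1 / 3x3 / 5x5 convolution branches plus a pooling branch are
    concatenated, projected back to the input channel count, and added to the
    input through a residual connection, so the block preserves tensor shape.
    """

    def __init__(self, channels: int):
        super().__init__()
        branch = channels // 4  # assumed branch width; not specified in the paper

        def conv(kernel: int, padding: int) -> nn.Sequential:
            return nn.Sequential(
                nn.Conv2d(channels, branch, kernel, padding=padding),
                nn.InstanceNorm2d(branch),
                nn.ReLU(inplace=True),
            )

        self.b1 = conv(1, 0)
        self.b3 = conv(3, 1)
        self.b5 = conv(5, 2)
        self.pool = nn.Sequential(nn.MaxPool2d(3, stride=1, padding=1), conv(1, 0))
        self.project = nn.Conv2d(4 * branch, channels, 1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out = torch.cat([self.b1(x), self.b3(x), self.b5(x), self.pool(x)], dim=1)
        return x + self.project(out)


# Example: drop-in use at the usual 256-channel CycleGAN bottleneck width.
block = InceptionBlock(256)
y = block(torch.randn(1, 256, 64, 64))  # output keeps the input shape
```

The multi-scale branches are the usual motivation for such a block: day-to-night conversion has to adjust both global illumination and small local light sources, which benefit from different receptive-field sizes.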