Applying FastPhotoStyle to Synthetic Data for Military Vehicle Detection

Hyeongkeun Lee, Kyungmin Lee, Hunmin Yang, Se-Yoon Oh
DOI: 10.23919/ICCAS50221.2020.9268331
Published in: 2020 20th International Conference on Control, Automation and Systems (ICCAS), pp. 137-140
Publication date: 2020-10-13
Cited by: 1

Abstract

Object detection is one of the main tasks in deep learning applications. Deep learning performance already exceeds human detection ability when abundant data are available for training deep neural networks. In military applications, the data shortage problem must be resolved before deep learning systems can be deployed effectively. Generating synthetic data is one solution, but the domain gap between synthetic and real data remains an obstacle to training. In this paper, we propose a method for reducing the domain gap by applying style transfer techniques to synthetic data for military vehicle detection. Applying FastPhotoStyle to the synthetic data efficiently improves object detection accuracy when real training data are insufficient. Specifically, we show that stylization, which makes artificial data more realistic, diminishes the domain gap, as evaluated by visualizing the feature distributions with principal component analysis and by the Fréchet inception distance score. As a result, performance improved by about 8% in the AP@50 metric for stylized synthetic data.
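The abstract evaluates the domain gap with two tools: a PCA visualization of feature distributions and the Fréchet inception distance (FID). The sketch below illustrates both mechanics on toy feature arrays; it is not the paper's implementation. In the paper's setting the features would come from an Inception network, whereas here random Gaussian arrays stand in for those embeddings, and the 0.7 mean shift on the "synthetic" set is an arbitrary assumption to make the gap visible.

```python
import numpy as np
from scipy import linalg


def frechet_distance(feats_a, feats_b):
    """Fréchet distance between Gaussians fitted to two feature sets.

    This is the formula underlying FID:
    ||mu_a - mu_b||^2 + Tr(C_a + C_b - 2 (C_a C_b)^(1/2)).
    """
    mu_a, mu_b = feats_a.mean(axis=0), feats_b.mean(axis=0)
    cov_a = np.cov(feats_a, rowvar=False)
    cov_b = np.cov(feats_b, rowvar=False)
    covmean = linalg.sqrtm(cov_a @ cov_b)
    if np.iscomplexobj(covmean):  # sqrtm can leave tiny imaginary noise
        covmean = covmean.real
    diff = mu_a - mu_b
    return float(diff @ diff + np.trace(cov_a + cov_b - 2.0 * covmean))


def pca_project_2d(features):
    """Project feature vectors onto their top two principal components."""
    centered = features - features.mean(axis=0)
    # Right singular vectors of the centered data are the PCA directions.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    return centered @ vt[:2].T


# Toy stand-ins: "real" features vs. mean-shifted "synthetic" features.
rng = np.random.default_rng(0)
real = rng.normal(0.0, 1.0, size=(400, 16))
synth = rng.normal(0.7, 1.0, size=(400, 16))

# The Fréchet distance between the two domains exceeds the distance
# between two halves of the same domain.
fid_gap = frechet_distance(real, synth)
fid_same = frechet_distance(real[:200], real[200:])

# Fit PCA on the pooled features so both sets share one 2-D projection;
# a domain gap then appears as separated clusters in the scatter plot.
proj = pca_project_2d(np.vstack([real, synth]))
real_2d, synth_2d = proj[:400], proj[400:]
centroid_gap = np.linalg.norm(real_2d.mean(axis=0) - synth_2d.mean(axis=0))
```

A successful stylization step would move the synthetic cluster toward the real one, shrinking both `fid_gap` and `centroid_gap`.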