Foggy-DOTA: An adverse weather dataset for object detection in aerial images
Nguyen D. Vo, Phuc Nguyen, Thang Truong, Hoan C. Nguyen, Khang Nguyen
2022 9th NAFOSTED Conference on Information and Computer Science (NICS), 2022-10-31. DOI: 10.1109/NICS56915.2022.10013441
Abstract
Object detection in aerial images under adverse weather, especially in foggy scenes, has become both very challenging and highly practical. Fog and clouds appear in a large share of drone-captured aerial images around the world, especially in the early morning. Recognizing the need for deep learning approaches that can cope with such conditions, we propose the Foggy-DOTA dataset, derived from the original DOTA dataset, and empirically evaluate multiple state-of-the-art methods on it. After extensive experiments on several well-known baselines, ReDet achieves the best results: 76.680 mAP on the original DOTA and 74.194 mAP on our Foggy-DOTA dataset (only 60.706 mAP if trained on DOTA). S2ANet and RoI Transformer achieve 74.190 and 76.09 mAP on the original DOTA but only 71.629 and 73.381 mAP on our Foggy-DOTA dataset (dropping sharply to 46.503 and 40.297 mAP if trained on DOTA), respectively. Our work provides a comprehensive statistical evaluation that serves as an essential baseline for future object detection research.
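The abstract does not specify how fog is rendered onto the DOTA images. A common choice in the fog-augmentation literature (e.g., Foggy Cityscapes) is the atmospheric scattering model I(x) = J(x)·t(x) + A·(1 − t(x)) with transmission t(x) = exp(−β·d(x)). The sketch below is a minimal, hypothetical illustration of that model under a constant-depth simplification, not the paper's actual pipeline; the function name and the `beta`, `airlight`, and `depth` parameters are assumptions for illustration only.

```python
import numpy as np

def add_synthetic_fog(image: np.ndarray, beta: float = 1.0,
                      airlight: float = 0.9, depth: float = 1.0) -> np.ndarray:
    """Apply the atmospheric scattering model to a clear image.

    I(x) = J(x) * t(x) + A * (1 - t(x)),  with t(x) = exp(-beta * d(x)).
    Here depth is treated as constant across the scene (a hypothetical
    simplification for nadir aerial views), so the fog is homogeneous.

    image: float array in [0, 1] with shape (H, W, 3).
    beta: fog density; larger values give thicker fog.
    airlight: global atmospheric light A in [0, 1].
    """
    t = np.exp(-beta * depth)                      # homogeneous transmission
    foggy = image * t + airlight * (1.0 - t)       # scattering model
    return np.clip(foggy, 0.0, 1.0)

if __name__ == "__main__":
    # Example: fog a random stand-in "aerial" image at three densities.
    rng = np.random.default_rng(0)
    clear = rng.random((512, 512, 3))
    for beta in (0.5, 1.0, 2.0):
        foggy = add_synthetic_fog(clear, beta=beta)
        print(f"beta={beta}: mean intensity {foggy.mean():.3f}")
```

Under this model, increasing `beta` pushes pixel values toward the airlight, washing out contrast in the same way thick fog degrades detector inputs; evaluating detectors trained on clear DOTA against such degraded images is what produces the mAP drops reported above.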