AutoSelecter: Efficient Synthetic Nighttime Images Make Object Detector Stronger

Authors: Meng Chao; Wang Mengjie; Shi Wenxiu; Zhu Huiping; Zhang Song; Zhang Rui; Yang Ming
DOI: 10.1109/LRA.2025.3552996
Journal: IEEE Robotics and Automation Letters, vol. 10, no. 5, pp. 4660-4665
Published: 2025-03-20 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10933579/
Citations: 0
Abstract
Object detection has achieved significant advances despite the challenges posed by adverse conditions such as low-light nighttime environments, where annotated data is not only scarce but also difficult to label accurately. Instead of designing a specialized network, we focus on the creation and efficient utilization of synthetic data to address this problem. We generate synthetic data with an enhanced generative model that transforms daytime images into low-light nighttime ones. Furthermore, we introduce a data selection scheme, named AutoSelecter, which can be flexibly integrated into the training process of an object detector, ensuring that the most effective synthetic data is selected. By efficiently utilizing synthetic data, our strategy achieves average improvements of 5.2% and 6.1% in AP$_{50}$ on the nighttime datasets of BDD100k and Waymo, respectively, for the YOLOv7, YOLOv8, and RT-DETR object detectors. We have also discovered numerous missed and mislabeled annotations in manually annotated low-light nighttime datasets, which can significantly distort nighttime evaluation results. Consequently, we provide a manually annotated, more accurate dataset, BDD100kValNight+, for better evaluation. On this refined dataset, our strategy achieves an average improvement of 5.1% in AP$_{50}$ across the three detectors.
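The abstract describes AutoSelecter only at a high level: during training, it keeps the synthetic samples that are most useful to the current detector. A minimal sketch of that general idea is below; the scoring function, the keep ratio, and all names here are assumptions for illustration, not the paper's actual method.

```python
# Hypothetical sketch of a training-time data-selection step in the spirit of
# AutoSelecter: rank synthetic samples by a usefulness score (e.g. the current
# detector's loss on each sample) and keep only the top fraction.
# score_fn and keep_ratio are illustrative assumptions, not the paper's design.

def select_synthetic(samples, score_fn, keep_ratio=0.5):
    """Rank synthetic samples by score (descending) and keep the top fraction."""
    ranked = sorted(samples, key=score_fn, reverse=True)
    keep = max(1, int(len(ranked) * keep_ratio))
    return ranked[:keep]

# Toy usage: each sample carries a simulated detector loss; a higher loss is
# treated here as "more informative" for the next training round.
samples = [{"id": i, "loss": l} for i, l in enumerate([0.2, 0.9, 0.5, 0.7])]
selected = select_synthetic(samples, score_fn=lambda s: s["loss"], keep_ratio=0.5)
print([s["id"] for s in selected])  # the two highest-loss samples
```

In a real pipeline this selection would run periodically inside the detector's training loop, so the kept subset adapts as the model improves.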
Journal Description:
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.