Radar-Enhanced Image Fusion-based Object Detection for Autonomous Driving
Yaqing Gu, Shiyu Meng, Kun Shi
2022 IEEE International Conference on Signal Processing, Communications and Computing (ICSPCC), 25 October 2022
DOI: 10.1109/ICSPCC55723.2022.9984358
Accurate and robust object detection is imperative for autonomous driving. In real-world scenarios, the effectiveness of image-based detectors is limited by low visibility and harsh conditions. Owing to their immunity to environmental variability, millimeter-wave (mmWave) radar sensors are complementary to camera sensors, opening up the possibility of radar-camera fusion to improve object detection performance. In this paper, we construct a Radar-Enhanced image Fusion Network (REFNet) for 2D object detection in autonomous driving. Specifically, the radar data is projected onto the camera image plane to unify the data format of the heterogeneous sensing modalities. To overcome the sparsity of radar point clouds, we devise an Uncertainty Radar Block (URB) that increases the density of radar points by accounting for the azimuth uncertainty of radar measurements. Additionally, we design an adaptive network architecture that supports multi-level fusion and can determine the optimal fusion level. Moreover, we incorporate a robust attention module within the fusion network to exploit the synergy of radar and camera information. Evaluated on the canonical nuScenes dataset, our proposed method consistently and significantly outperforms the image-only baseline across all scenarios, especially in nighttime and rainy conditions.
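The two preprocessing steps the abstract names — projecting radar points onto the camera image plane and densifying them along the azimuth uncertainty — can be illustrated with a minimal sketch. This is not the paper's implementation: the function names, the pinhole projection with a 4×4 extrinsic transform, and the uniform azimuth-offset sampling in `densify_azimuth` are all illustrative assumptions, and the actual URB design may differ.

```python
import numpy as np

def project_radar_to_image(points_xyz, K, T_cam_radar):
    """Project 3D radar points into the camera image plane.

    points_xyz:  (N, 3) radar points in the radar frame.
    K:           (3, 3) camera intrinsic matrix.
    T_cam_radar: (4, 4) extrinsic transform, radar frame -> camera frame.
    Returns (N, 2) pixel coordinates and (N,) depths in the camera frame.
    """
    n = points_xyz.shape[0]
    homo = np.hstack([points_xyz, np.ones((n, 1))])   # homogeneous coords, (N, 4)
    cam = (T_cam_radar @ homo.T).T[:, :3]             # points in the camera frame
    depth = cam[:, 2]
    uv = (K @ cam.T).T[:, :2] / depth[:, None]        # perspective divide
    return uv, depth

def densify_azimuth(points_xyz, sigma_az_deg=1.0, n_samples=5):
    """Hypothetical URB-style densification: replicate each radar point at
    several azimuth offsets within the sensor's angular uncertainty, so the
    projected points cover a wider image region than a single sparse return."""
    offsets = np.deg2rad(np.linspace(-2 * sigma_az_deg, 2 * sigma_az_deg, n_samples))
    replicas = []
    for d in offsets:
        c, s = np.cos(d), np.sin(d)
        R = np.array([[c, -s, 0.0],
                      [s,  c, 0.0],
                      [0.0, 0.0, 1.0]])               # rotate about the vertical axis
        replicas.append(points_xyz @ R.T)
    return np.vstack(replicas)                        # (N * n_samples, 3)
```

In this sketch a point directly on the optical axis projects to the principal point, and each radar return is expanded into `n_samples` copies spread over its plausible azimuth range before projection, which approximates how a density-increasing block could compensate for radar sparsity.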