Geryon: Edge Assisted Real-time and Robust Object Detection on Drones via mmWave Radar and Camera Fusion

Kaikai Deng, Dong Zhao, Qiaoyue Han, Shuyue Wang, Zihan Zhang, Anfu Zhou, Huadong Ma
{"title":"Geryon: Edge Assisted Real-time and Robust Object Detection on Drones via mmWave Radar and Camera Fusion","authors":"Kaikai Deng, Dong Zhao, Qiaoyue Han, Shuyue Wang, Zihan Zhang, Anfu Zhou, Huadong Ma","doi":"10.1145/3550298","DOIUrl":null,"url":null,"abstract":"Vision-based drone-view object detection suffers from severe performance degradation under adverse conditions (e.g., foggy weather, poor illumination). To remedy this, leveraging complementary mmWave radar has become a trend. However, existing fusion approaches seldom apply to drones due to i) the aggravated sparsity and noise of point clouds from low-cost commodity radars, and ii) explosive sensing data and intensive computations leading to high latency. To address these issues, we design Geryon , an edge assisted object detection system on drones, which utilizes a suit of approaches to fully exploit the complementary advantages of camera and mmWave radar on three levels: (i) a novel multi-frame compositing approach utilizes camera to assist radar to address the aggravated sparsity and noise of radar point clouds; (ii) a saliency area extraction and encoding approach utilizes radar to assist camera to reduce the bandwidth consumption and offloading latency; (iii) a parallel transmission and inference approach with a lightweight box enhancement scheme further reduces the offloading latency while ensuring the edge-side accuracy-latency trade-off by the parallelism and better camera-radar fusion. We implement and evaluate Geryon with four datasets we collect under foggy/rainy/snowy weather and poor illumination conditions, demonstrating its great advantages over other state-of-the-art approaches in terms of both accuracy and latency. CCS Concepts:","PeriodicalId":20463,"journal":{"name":"Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"7","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proc. ACM Interact. Mob. Wearable Ubiquitous Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3550298","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 7

Abstract

Vision-based drone-view object detection suffers from severe performance degradation under adverse conditions (e.g., foggy weather, poor illumination). To remedy this, leveraging complementary mmWave radar has become a trend. However, existing fusion approaches seldom apply to drones due to (i) the aggravated sparsity and noise of point clouds from low-cost commodity radars, and (ii) explosive sensing data and intensive computations leading to high latency. To address these issues, we design Geryon, an edge-assisted object detection system on drones, which utilizes a suite of approaches to fully exploit the complementary advantages of the camera and mmWave radar on three levels: (i) a novel multi-frame compositing approach utilizes the camera to assist the radar in addressing the aggravated sparsity and noise of radar point clouds; (ii) a saliency area extraction and encoding approach utilizes the radar to assist the camera in reducing bandwidth consumption and offloading latency; (iii) a parallel transmission and inference approach with a lightweight box enhancement scheme further reduces the offloading latency while ensuring the edge-side accuracy-latency trade-off through parallelism and better camera-radar fusion. We implement and evaluate Geryon with four datasets we collect under foggy/rainy/snowy weather and poor illumination conditions, demonstrating its great advantages over other state-of-the-art approaches in terms of both accuracy and latency.
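To make the radar-assisted saliency idea above concrete, the following is a minimal Python sketch, not Geryon's actual implementation: it projects mmWave radar points into the camera frame and groups them into padded regions of interest that could be encoded at high quality while the background is compressed more aggressively. The projection matrix, clustering radius, and ROI padding used here are assumed values for illustration only.

```python
# Hedged sketch of radar-guided saliency area extraction (assumed parameters,
# not the paper's implementation): project radar points into the image plane,
# group nearby projections, and return padded bounding boxes as candidate
# saliency areas for selective encoding/offloading.
import numpy as np

def project_radar_to_image(points_xyz: np.ndarray, proj: np.ndarray) -> np.ndarray:
    """Project Nx3 radar points into pixel coordinates with a 3x4 camera matrix."""
    homog = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])  # Nx4
    uvw = homog @ proj.T                                                # Nx3
    return uvw[:, :2] / uvw[:, 2:3]                                     # Nx2 pixel coords

def saliency_boxes(pixels: np.ndarray, radius: float = 40.0, pad: int = 32):
    """Greedily group projected points within `radius` pixels of a seed point
    and return padded bounding boxes (x1, y1, x2, y2) as saliency areas."""
    boxes, used = [], np.zeros(len(pixels), dtype=bool)
    for i in range(len(pixels)):
        if used[i]:
            continue
        member = np.linalg.norm(pixels - pixels[i], axis=1) < radius
        used |= member
        cluster = pixels[member]
        x1, y1 = cluster.min(axis=0) - pad
        x2, y2 = cluster.max(axis=0) + pad
        boxes.append((int(x1), int(y1), int(x2), int(y2)))
    return boxes

if __name__ == "__main__":
    # Toy point cloud around one object and an assumed pinhole projection.
    rng = np.random.default_rng(0)
    points = rng.normal(loc=[5.0, 0.0, 20.0], scale=0.5, size=(30, 3))
    proj = np.array([[800.0, 0.0, 640.0, 0.0],
                     [0.0, 800.0, 360.0, 0.0],
                     [0.0, 0.0, 1.0, 0.0]])
    print(saliency_boxes(project_radar_to_image(points, proj)))
```

In such a pipeline, only the returned boxes would be cropped and sent at full quality, which is one plausible way to realize the bandwidth and latency savings the abstract attributes to radar-assisted saliency extraction.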