Towards a Real Time Obstacle Detection System for Unmanned Surface Vehicles

M. Sorial, Issa Mouawad, E. Simetti, F. Odone, G. Casalino
DOI: 10.23919/OCEANS40490.2019.8962685
Published in: OCEANS 2019 MTS/IEEE SEATTLE, October 2019
Citations: 7

Abstract

Environment perception plays a vital role in autonomous systems. Different sensory channels are exploited to capture a rich and dynamic grasp of the surroundings and to enable safe and efficient navigation. In this work, scene understanding in maritime environments is tackled using an obstacle detection and tracking procedure based on a video camera and a LIDAR (range sensor). We rely on the state-of-the-art object detection CNN (convolutional neural network) YOLO to detect obstacles in the image plane, and we propose an efficient object tracking procedure to track objects across frames. LIDAR point clouds are used to locate obstacles in the world reference frame and thus to derive a scene map with the obstacles and their headings, which is used as an input to a path planning algorithm. The procedure is tested using data acquired from an unmanned surface vessel, and typical metrics from the object detection and tracking communities are used to evaluate our method. The results demonstrate the improvement we achieve over the object detector baseline and other similar online tracking methods.
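The tracking step described above (associating detections across frames) can be illustrated with a minimal sketch. The paper's actual tracking procedure is not specified in the abstract; the greedy IoU-based association below is a common baseline for this kind of online tracking and is only an assumption. YOLO inference and LIDAR processing are out of scope here, so detections are given as plain bounding boxes.

```python
# Hypothetical sketch of frame-to-frame track association via greedy IoU
# matching. This is an illustrative baseline, not the paper's method.
from dataclasses import dataclass


@dataclass
class Track:
    track_id: int
    box: tuple      # (x1, y1, x2, y2) in image coordinates
    misses: int = 0  # consecutive frames without a matching detection


def iou(a, b):
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def associate(tracks, detections, iou_thresh=0.3):
    """Greedily match each existing track to its best new detection.

    Matched tracks are updated in place; unmatched tracks accumulate a
    miss count (so they can be dropped after a few missed frames).
    Returns the indices of detections that should start new tracks.
    """
    unmatched = list(range(len(detections)))
    for t in tracks:
        best, best_iou = None, iou_thresh
        for j in unmatched:
            score = iou(t.box, detections[j])
            if score > best_iou:
                best, best_iou = j, score
        if best is not None:
            t.box, t.misses = detections[best], 0
            unmatched.remove(best)
        else:
            t.misses += 1
    return unmatched
```

In a full pipeline, each surviving track's box would then be combined with the LIDAR point cloud (e.g., by selecting the points that project into the box) to estimate the obstacle's position in the world frame.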