Towards a Real Time Obstacle Detection System for Unmanned Surface Vehicles
M. Sorial, Issa Mouawad, E. Simetti, F. Odone, G. Casalino
OCEANS 2019 MTS/IEEE SEATTLE, October 2019. DOI: 10.23919/OCEANS40490.2019.8962685 (https://doi.org/10.23919/OCEANS40490.2019.8962685)
Citations: 7
Abstract
Environment perception plays a vital role in autonomous systems. Different sensory channels are exploited to capture a rich and dynamic grasp of the surroundings to enable safe and efficient navigation. In this work, scene understanding in maritime environments is tackled using an obstacle detection and tracking procedure based on a video camera and a LIDAR (range sensor). We rely on the state-of-the-art object detection CNN (convolutional neural network) YOLO to detect obstacles in the image plane, and we propose an efficient object tracking procedure to track objects across frames. LIDAR point clouds are used to locate obstacles in the world reference frame and thus to derive a scene map with the obstacles and their headings, which is used as input to a path planning algorithm. The procedure is tested using data acquired from an unmanned surface vessel, and typical metrics from the object detection and tracking communities are used to evaluate our method. The results demonstrate the improvement we achieve over the object detector baseline and other similar online tracking methods.
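
To make the detect-and-track step of such a pipeline concrete, the sketch below shows one common way to associate per-frame detector outputs (e.g., YOLO bounding boxes) with persistent tracks via greedy IoU matching. This is a minimal illustrative sketch, not the tracking procedure proposed in the paper (whose details are not given in the abstract); the names `Track`, `iou`, and `update_tracks` and the thresholds are assumptions chosen for exposition.

```python
# Minimal sketch of IoU-based detection-to-track association across frames.
# Illustrative assumption only -- NOT the authors' tracking procedure.

from dataclasses import dataclass
from typing import List, Tuple

Box = Tuple[float, float, float, float]  # (x1, y1, x2, y2) in image coordinates


def iou(a: Box, b: Box) -> float:
    """Intersection-over-union of two axis-aligned boxes."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


@dataclass
class Track:
    track_id: int
    box: Box
    misses: int = 0  # consecutive frames without a matched detection


def update_tracks(tracks: List[Track], detections: List[Box],
                  iou_thresh: float = 0.3, max_misses: int = 5) -> List[Track]:
    """Greedily match detections to existing tracks by IoU; spawn new tracks
    for unmatched detections and drop tracks missed for too many frames."""
    next_id = max((t.track_id for t in tracks), default=-1) + 1
    unmatched = list(detections)
    for track in tracks:
        if not unmatched:
            track.misses += 1
            continue
        best = max(unmatched, key=lambda d: iou(track.box, d))
        if iou(track.box, best) >= iou_thresh:
            track.box = best
            track.misses = 0
            unmatched.remove(best)
        else:
            track.misses += 1
    for det in unmatched:  # detections with no matching track start new tracks
        tracks.append(Track(next_id, det))
        next_id += 1
    return [t for t in tracks if t.misses <= max_misses]
```

In use, each video frame's detector boxes would be passed to `update_tracks`, with the returned list kept as state for the next frame; the stable track IDs are what a downstream module (here, LIDAR-based localization and the scene map) would attach world positions and headings to.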