Tan-Thien-Nien Nguyen, Thanh-Danh Phan, Minh-Thien Duong, Chi-Tam Nguyen, Hong-Phong Ly, M. Le
Sensor Fusion of Camera and 2D LiDAR for Self-Driving Automobile in Obstacle Avoidance Scenarios
Published in: 2022 International Workshop on Intelligent Systems (IWIS), 2022-08-17
DOI: 10.1109/IWIS56333.2022.9920917
Citations: 2
Abstract
Obstacle dodging and overtaking are pivotal tasks for ensuring the safety of self-driving automobiles, and multi-sensor fusion is a prerequisite for perceiving the full surroundings. This paper proposes a novel strategy for dodging a dynamic car ahead, designed for automobiles with a left-hand steering wheel, by fusing camera and 2D LiDAR features. First, we improve the LiteSeg model to extract a segmentation mask that determines the drivable area and the avoidance direction. In addition to the camera, a 2D LiDAR checks the scene on the right side, which lies outside the camera's field of view. For the LiDAR point clouds, we adopt the Adaptive Breakpoint Detection (ABD) algorithm to cluster objects in the scanning plane. Subsequently, the RANSAC algorithm fits a straight line to the clustered points to determine the boundary of the right-side obstacle, and the distance from the LiDAR to this line is computed to maintain a safe gap while overtaking. Finally, the post-processed outputs of the two sensors are fused to decide on obstacle dodging and overtaking. Comprehensive experiments show that our self-driving automobile performs well in diverse scenarios on a university campus.
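The LiDAR-side pipeline in the abstract (ABD clustering, RANSAC line fitting, and a point-to-line distance check) can be sketched as follows. This is a minimal illustration, not the paper's implementation: the adaptive threshold uses the common Borges–Aldon form, and the parameter values (`lam`, `sigma`, `tol`) are assumed defaults, not values from the paper.

```python
import math
import random

def abd_cluster(scan, lam=math.radians(10), sigma=0.01, dphi=math.radians(1)):
    """Split an ordered 2D LiDAR scan [(range, angle), ...] into clusters.

    A breakpoint is declared when the gap between consecutive points exceeds
    an adaptive threshold that grows with range (Borges-Aldon style ABD).
    """
    pts = [(r * math.cos(a), r * math.sin(a)) for r, a in scan]
    clusters, current = [], [pts[0]]
    for i in range(1, len(scan)):
        r_prev = scan[i - 1][0]
        # adaptive distance threshold: farther points are spaced wider apart
        d_max = r_prev * math.sin(dphi) / math.sin(lam - dphi) + 3 * sigma
        if math.dist(pts[i], pts[i - 1]) > d_max:
            clusters.append(current)
            current = []
        current.append(pts[i])
    clusters.append(current)
    return clusters

def ransac_line(points, iters=200, tol=0.03, seed=0):
    """Fit a line a*x + b*y + c = 0 (with a^2 + b^2 = 1) via RANSAC."""
    rng = random.Random(seed)
    best, best_inliers = None, -1
    for _ in range(iters):
        (x1, y1), (x2, y2) = rng.sample(points, 2)
        a, b = y2 - y1, x1 - x2            # normal vector of the candidate line
        norm = math.hypot(a, b)
        if norm == 0:
            continue
        a, b = a / norm, b / norm
        c = -(a * x1 + b * y1)
        inliers = sum(abs(a * x + b * y + c) <= tol for x, y in points)
        if inliers > best_inliers:
            best, best_inliers = (a, b, c), inliers
    return best

def lidar_to_line_distance(line):
    """Distance from the LiDAR origin (0, 0) to the fitted boundary line."""
    a, b, c = line
    return abs(c)  # |a*0 + b*0 + c| with unit normal (a, b)
```

With a synthetic scan of a wall on the right, `abd_cluster` separates it from other returns, `ransac_line` recovers its boundary, and `lidar_to_line_distance` yields the lateral clearance the controller would monitor while overtaking.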