Designing an Autonomous Vehicle Using Sensor Fusion Based on Path Planning and Deep Learning Algorithms

Bhakti Y. Suprapto; Suci Dwijayanti; Dimsyiar M.A. Hafiz; Farhan A. Ardandy; Javen Jonathan

SAIEE Africa Research Journal (Q4, Engineering, Electrical & Electronic)
Published: 2024-06-06 | DOI: 10.23919/SAIEE.2024.10551314
Article page: https://ieeexplore.ieee.org/document/10551314/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10551314
Citations: 0
Abstract
Autonomous electric vehicles use camera sensors for vision-based steering control and for detecting both roads and objects. In this study, road and object detection are combined using the YOLOv8x-seg model trained for 200 epochs, which achieved the lowest segmentation loss of 0.53182. Simulation tests demonstrate accurate road and object detection, effective object-distance measurement, and real-time road identification for steering control, keeping the vehicle on track with an average object-distance measurement error of 2.245 m. Route planning is crucial for autonomous vehicles, and the A* algorithm is employed to find the optimal route. In real-time tests, when an obstacle is placed between nodes 6 and 7, the A* algorithm reroutes from the original path (5, 6, 7, 27, 28) to a new path (5, 6, 9, 27, 28). This study demonstrates the vital role of sensor fusion in autonomous vehicles by integrating various sensors, focusing on sensor fusion for object-road detection and path planning with the A* algorithm. Real-time tests in two different scenarios demonstrate the successful integration of sensor fusion, enabling the vehicle to follow planned routes. However, some route nodes remain unreachable, requiring occasional driver intervention. These results demonstrate the feasibility of sensor fusion with diverse tasks in Level 3 autonomous vehicles.
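The rerouting behaviour described above (an obstacle between nodes 6 and 7 forcing a detour through node 9) can be illustrated with a minimal A* sketch. The node coordinates and edge list below are hypothetical, chosen only to reproduce the two paths quoted in the abstract; the paper's actual road-network map is not given here.

```python
# Minimal A* sketch of the rerouting case from the abstract.
# NOTE: coordinates and edges are HYPOTHETICAL, not the paper's real map.
import heapq
import math

coords = {5: (0, 0), 6: (1, 0), 7: (2, 0), 9: (2, 1), 27: (3, 0), 28: (4, 0)}
edges = [(5, 6), (6, 7), (7, 27), (6, 9), (9, 27), (27, 28)]

def dist(a, b):
    """Euclidean distance between two nodes."""
    (x1, y1), (x2, y2) = coords[a], coords[b]
    return math.hypot(x2 - x1, y2 - y1)

def a_star(start, goal, blocked=frozenset()):
    """A* search over the node graph; `blocked` holds impassable edges."""
    # Build an undirected adjacency list, skipping blocked edges.
    adj = {n: [] for n in coords}
    for u, v in edges:
        if (u, v) in blocked or (v, u) in blocked:
            continue
        adj[u].append(v)
        adj[v].append(u)
    # Straight-line distance to the goal is an admissible heuristic here,
    # since edge costs are the Euclidean distances between endpoints.
    open_set = [(dist(start, goal), 0.0, start, [start])]
    best_g = {start: 0.0}
    while open_set:
        _f, g, node, path = heapq.heappop(open_set)
        if node == goal:
            return path
        for nxt in adj[node]:
            g2 = g + dist(node, nxt)
            if g2 < best_g.get(nxt, float("inf")):
                best_g[nxt] = g2
                heapq.heappush(
                    open_set, (g2 + dist(nxt, goal), g2, nxt, path + [nxt])
                )
    return None  # goal unreachable

print(a_star(5, 28))                    # -> [5, 6, 7, 27, 28]
print(a_star(5, 28, blocked={(6, 7)}))  # -> [5, 6, 9, 27, 28]
```

With the edge (6, 7) blocked, the search pays the longer diagonal hops through node 9 but still reaches the goal, matching the reroute reported in the real-time test.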