{"title":"Deep Neural Network for Visual Localization of Autonomous Car in ITS Campus Environment","authors":"Rudy Dikairono, Hendra Kusuma, Arnold Prajna","doi":"10.12962/jaree.v7i2.365","DOIUrl":null,"url":null,"abstract":"Intelligent Car (I-Car) ITS is an autonomous car prototype where one of the main localization methods is obtained through reading GPS data. However the accuracy of GPS readings is influenced by the availability of the information from GPS satellites, in which it often depends on the conditions of the place at that time, such as weather or atmospheric conditions, signal blockage, and density of a land. In this paper we propose the solution to overcome the unavailability of GPS localization information based on the omnidirectional camera visual data through environmental recognition around the ITS campus using Deep Neural Network. The process of recognition is to take GPS coordinate data to be used as an output reference point when the omnidirectional camera takes images of the surrounding environment. Visual localization trials were carried out in the ITS environment with a total of 200 GPS coordinates, where each GPS coordinate represents one class so that there are 200 classes for classification. Each coordinate/class has 96 training images. This condition is achieved for a vehicle speed of 20 km/h, with an image acquisition speed of 30 fps from the omnidirectional camera. By using AlexNet architecture, the result of visual localization accuracy is 49-54%. The test results were obtained by using a learning rate parameter of 0.00001, data augmentation, and the Drop Out technique to prevent overfitting and improve accuracy stability.","PeriodicalId":32708,"journal":{"name":"JAREE Journal on Advanced Research in Electrical Engineering","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-07-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"JAREE Journal on Advanced Research in Electrical Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.12962/jaree.v7i2.365","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Intelligent Car (I-Car) ITS is an autonomous car prototype whose primary localization method relies on reading GPS data. However, the accuracy of GPS readings depends on the availability of information from GPS satellites, which in turn is affected by local conditions at the time of measurement, such as weather and atmospheric conditions, signal blockage, and the density of the surrounding area. In this paper we propose a solution to overcome the unavailability of GPS localization information by recognizing the environment around the ITS campus from omnidirectional camera imagery using a Deep Neural Network. During data collection, GPS coordinates are recorded and used as output reference points while the omnidirectional camera captures images of the surrounding environment. Visual localization trials were carried out in the ITS campus environment with a total of 200 GPS coordinates, where each GPS coordinate represents one class, yielding 200 classes for classification. Each coordinate/class has 96 training images. The dataset was collected at a vehicle speed of 20 km/h with an image acquisition rate of 30 fps from the omnidirectional camera. Using the AlexNet architecture, the resulting visual localization accuracy is 49-54%. The results were obtained using a learning rate of 0.00001, data augmentation, and dropout to prevent overfitting and improve accuracy stability.
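For illustration, the following is a minimal sketch (not the authors' code) of how an AlexNet-based classifier could be configured for the 200-class visual localization task described above, assuming PyTorch/torchvision. The class count, learning rate, and use of dropout and data augmentation follow the abstract; the specific transforms, image size, dropout rate, and optimizer choice are assumptions.

```python
# Sketch of a 200-class AlexNet localization classifier (assumptions noted inline).
import torch
import torch.nn as nn
from torchvision import models, transforms

NUM_CLASSES = 200      # one class per recorded GPS coordinate (from the abstract)
LEARNING_RATE = 1e-5   # learning rate reported in the abstract

# Assumed augmentation pipeline: the abstract mentions data augmentation
# but does not specify which transforms were used.
train_transform = transforms.Compose([
    transforms.Resize((224, 224)),                        # AlexNet's expected input size
    transforms.RandomHorizontalFlip(),
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
    transforms.ToTensor(),
])

# AlexNet with its final fully connected layer replaced for 200 classes.
model = models.alexnet(weights=None)
model.classifier[6] = nn.Linear(4096, NUM_CLASSES)

# AlexNet's classifier already contains dropout layers; p=0.5 is an assumed
# rate, since the paper does not state the value used.
model.classifier[0] = nn.Dropout(p=0.5)
model.classifier[3] = nn.Dropout(p=0.5)

criterion = nn.CrossEntropyLoss()
optimizer = torch.optim.Adam(model.parameters(), lr=LEARNING_RATE)
```

Framing each GPS coordinate as a separate class turns localization into ordinary image classification, so a standard cross-entropy training loop over the 96 images per class would apply directly to this setup.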