Title: Accurate vehicle self-localization in high definition map dataset
Authors: Andi Zang, Zichen Li, D. Doria, Goce Trajcevski
DOI: 10.1145/3149092.3149094 (https://doi.org/10.1145/3149092.3149094)
Published in: Proceedings of the 1st ACM SIGSPATIAL Workshop on High-Precision Maps and Intelligent Applications for Autonomous Vehicles
Publication date: 2017-11-07
Citations: 25
Abstract
One of the biggest challenges in automated driving is determining the vehicle's location in real time, a process known as self-localization or ego-localization. An automated driving system must remain reliable under harsh conditions and environmental uncertainties such as GPS denial or imprecision, sensor malfunction, road occlusions, poor lighting, and inclement weather. To cope with this myriad of potential problems, systems typically combine a GPS receiver, in-vehicle sensors (e.g., cameras and LiDAR devices), and 3D High-Definition (3D HD) Maps. In this paper, we review state-of-the-art self-localization techniques and present a benchmark for the task of image-based vehicle self-localization. Our dataset was collected on 10 km of the Warren Freeway in the San Francisco area under reasonable traffic and weather conditions. As input to the localization process, we provide timestamp-synchronized, consumer-grade monocular video frames (with camera intrinsic parameters), a consumer-grade GPS trajectory, and production-grade 3D HD Maps. For evaluation, we provide a survey-grade GPS trajectory. The goal of this dataset is to standardize and formalize the challenge of accurate vehicle self-localization and to provide a benchmark for developing and evaluating algorithms.
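Since the benchmark evaluates an estimated trajectory against a survey-grade GPS reference, the core of any evaluation is a per-timestamp horizontal-error computation between two latitude/longitude tracks. The paper does not specify its evaluation code, so the following is only a minimal sketch of that idea: a haversine great-circle distance and a helper that compares two timestamp-keyed trajectories. The function names, the dict-based trajectory representation, and the sample coordinates are all illustrative assumptions, not part of the dataset.

```python
import math


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two WGS84 lat/lon points
    (haversine formula with a mean Earth radius of 6,371 km)."""
    R = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * R * math.asin(math.sqrt(a))


def localization_errors(estimated, reference):
    """Per-timestamp horizontal error (meters) between an estimated
    trajectory and a survey-grade reference. Both arguments are dicts
    mapping timestamp -> (lat, lon); only timestamps present in both
    trajectories are compared (they are timestamp-synchronized here
    by construction, as in the dataset)."""
    common = sorted(set(estimated) & set(reference))
    return [haversine_m(*estimated[t], *reference[t]) for t in common]


if __name__ == "__main__":
    # Illustrative coordinates only; not taken from the dataset.
    est = {0.0: (37.80000, -122.19000), 0.1: (37.80010, -122.18990)}
    ref = {0.0: (37.80000, -122.19000), 0.1: (37.80011, -122.18990)}
    errs = localization_errors(est, ref)
    print(["%.2f" % e for e in errs])
```

Mean or percentile statistics over such per-timestamp errors are the usual way to summarize a localization benchmark; a production pipeline would typically use a geodesic library (e.g., pyproj) rather than the spherical approximation above.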