{"title":"基于特征的旋转激光雷达低延迟定位","authors":"Lukas Beer;Thorsten Luettel;Mirko Maehlisch","doi":"10.1109/JISPIN.2025.3562512","DOIUrl":null,"url":null,"abstract":"An accurate global position is often considered to be one of the main requirements for autonomous driving. Even though GNSS provides a solution, it is dependent on the environment and not accurate enough. In this article, we present a fully GNSS-free localization, which uses maps and LiDAR to estimate the position of the vehicle. We tackle two major drawbacks of LiDAR-based localization: the limitation to the mapped area and a generally high latency. We use two different maps: a high-precision geometric HD map and a more general semantic occupancy grid map, resulting from OpenStreetMap. This allows us to provide a high-precision localization within the mapped area and a rough position estimate outside the mapped area. The coupling ensures seamless transitions when leaving or entering the HD map area, without losing the position and without the need for GNSS or loop closures. The latency is minimized by employing a continuous feature extraction. Instead of waiting for the full 360<inline-formula><tex-math>$^\\circ$</tex-math></inline-formula> rotation of the LiDAR, we extract semantic features during the rotation by combining a continuous instance and semantic segmentation. This reduces the latency to a minimum. We evaluate our approach in real-world experiments and show that it can localize the vehicle with a mean absolute error of 0.12 m using a full rotation of the LiDAR sensor, and 0.17 m with the continuous processing pipeline.","PeriodicalId":100621,"journal":{"name":"IEEE Journal of Indoor and Seamless Positioning and Navigation","volume":"3 ","pages":"105-116"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10970261","citationCount":"0","resultStr":"{\"title\":\"Toward Feature-Based Low-Latency Localization With Rotating LiDARs\",\"authors\":\"Lukas Beer;Thorsten Luettel;Mirko Maehlisch\",\"doi\":\"10.1109/JISPIN.2025.3562512\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An accurate global position is often considered to be one of the main requirements for autonomous driving. Even though GNSS provides a solution, it is dependent on the environment and not accurate enough. In this article, we present a fully GNSS-free localization, which uses maps and LiDAR to estimate the position of the vehicle. We tackle two major drawbacks of LiDAR-based localization: the limitation to the mapped area and a generally high latency. We use two different maps: a high-precision geometric HD map and a more general semantic occupancy grid map, resulting from OpenStreetMap. This allows us to provide a high-precision localization within the mapped area and a rough position estimate outside the mapped area. The coupling ensures seamless transitions when leaving or entering the HD map area, without losing the position and without the need for GNSS or loop closures. The latency is minimized by employing a continuous feature extraction. Instead of waiting for the full 360<inline-formula><tex-math>$^\\\\circ$</tex-math></inline-formula> rotation of the LiDAR, we extract semantic features during the rotation by combining a continuous instance and semantic segmentation. This reduces the latency to a minimum. 
We evaluate our approach in real-world experiments and show that it can localize the vehicle with a mean absolute error of 0.12 m using a full rotation of the LiDAR sensor, and 0.17 m with the continuous processing pipeline.\",\"PeriodicalId\":100621,\"journal\":{\"name\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"volume\":\"3 \",\"pages\":\"105-116\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-04-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10970261\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Journal of Indoor and Seamless Positioning and Navigation\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10970261/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Journal of Indoor and Seamless Positioning and Navigation","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10970261/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: An accurate global position is often considered one of the main requirements for autonomous driving. Although GNSS provides a solution, it depends on the environment and is not accurate enough. In this article, we present a fully GNSS-free localization approach that uses maps and LiDAR to estimate the position of the vehicle. We tackle two major drawbacks of LiDAR-based localization: the restriction to the mapped area and a generally high latency. We use two different maps: a high-precision geometric HD map and a more general semantic occupancy grid map derived from OpenStreetMap. This allows us to provide high-precision localization within the mapped area and a rough position estimate outside it. The coupling of the two maps ensures seamless transitions when leaving or entering the HD map area, without losing the position and without the need for GNSS or loop closures. Latency is minimized by continuous feature extraction: instead of waiting for the full 360° rotation of the LiDAR, we extract semantic features during the rotation by combining continuous instance and semantic segmentation. We evaluate our approach in real-world experiments and show that it can localize the vehicle with a mean absolute error of 0.12 m using a full rotation of the LiDAR sensor, and 0.17 m with the continuous processing pipeline.
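To make the continuous, low-latency idea concrete, the sketch below illustrates sector-wise processing of a rotating LiDAR stream: features are emitted as soon as each angular sector is complete rather than after a full 360° revolution. This is a minimal illustration under stated assumptions, not the authors' implementation; the sector width, point format, and the centroid "feature" stand in for the paper's combined continuous instance and semantic segmentation.

```python
import numpy as np

# Assumed sector width; the real pipeline's granularity may differ.
SECTOR_DEG = 30.0
NUM_SECTORS = int(360.0 / SECTOR_DEG)

def sector_index(point_xyz: np.ndarray) -> int:
    """Map a 3D point to its azimuth sector in [0, NUM_SECTORS)."""
    azimuth = np.degrees(np.arctan2(point_xyz[1], point_xyz[0])) % 360.0
    return int(azimuth // SECTOR_DEG)

def extract_features(points: np.ndarray) -> list:
    """Placeholder feature extractor for one sector (here: a single centroid)."""
    # In the paper this step combines continuous instance and semantic
    # segmentation; the centroid is only a stand-in for illustration.
    return [points.mean(axis=0)] if len(points) else []

def process_stream(point_stream):
    """Yield features as soon as each sector closes, instead of per full scan."""
    buffers = [[] for _ in range(NUM_SECTORS)]
    current = None
    for p in point_stream:                 # points assumed to arrive in scan order
        idx = sector_index(p)
        if current is None:
            current = idx
        if idx != current:                 # sector boundary crossed -> flush it
            yield extract_features(np.asarray(buffers[current]))
            buffers[current] = []
            current = idx
        buffers[current].append(p)
    if current is not None and buffers[current]:
        yield extract_features(np.asarray(buffers[current]))

# Example: simulate one sweep of points on a circle and consume features per sector.
if __name__ == "__main__":
    angles = np.linspace(0.0, 2 * np.pi, 720, endpoint=False)
    sweep = [np.array([np.cos(a), np.sin(a), 0.0]) for a in angles]
    for i, feats in enumerate(process_stream(sweep)):
        print(f"sector {i}: {len(feats)} feature(s) available")
```

The point of the sketch is only the scheduling: downstream map matching can start consuming features with at most one sector of delay, which is the latency benefit the abstract attributes to continuous feature extraction.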