LiDAR-based Cooperative Relative Localization

Jiqian Dong, Qi Chen, Deyuan Qu, Hongsheng Lu, Akila Ganlath, Qing Yang, Sikai Chen, S. Labi

2023 IEEE Intelligent Vehicles Symposium (IV), June 4, 2023. DOI: 10.1109/IV55152.2023.10186549
Vehicular cooperative perception aims to provide connected and automated vehicles (CAVs) with a longer and wider sensing range, making perception less susceptible to occlusions. However, this prospect is dimmed by the imperfection of onboard localization sensors such as Global Navigation Satellite Systems (GNSS), which can cause errors in aligning over-the-air perception data (from a remote vehicle) with a host vehicle's (HV's) local observation. To mitigate this challenge, we propose a novel LiDAR-based relative localization framework built on the iterative closest point (ICP) algorithm. The framework estimates the correct transformation matrix between a pair of CAVs' coordinate systems by exchanging and matching a limited yet carefully chosen set of point clouds and by using a coarse 2D map. From a deployment perspective, this means our framework consumes only modest transmission bandwidth and can run efficiently with limited resources. Extensive evaluations on both a synthetic dataset (COMAP) and KITTI-360 show that the proposed framework achieves state-of-the-art (SOTA) performance in cooperative localization. It can therefore be integrated with any upstream data fusion algorithm and serve as a preprocessor for high-quality cooperative perception.
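The registration step the abstract describes — estimating the rigid transform between two vehicles' coordinate frames from exchanged point clouds — can be illustrated with a minimal ICP loop. This is a generic 2-D sketch, not the authors' implementation: it pairs points by brute-force nearest neighbour and fits each update with a Kabsch (SVD) least-squares solve. All function names here are hypothetical.

```python
import numpy as np

def best_fit_transform(A, B):
    """Least-squares rigid transform (Kabsch): find R, t so that B ~ A @ R.T + t."""
    cA, cB = A.mean(axis=0), B.mean(axis=0)
    H = (A - cA).T @ (B - cB)            # 2x2 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:             # guard against a reflection
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = cB - R @ cA
    return R, t

def icp(src, dst, max_iters=50, tol=1e-8):
    """Align src onto dst; returns accumulated rotation R and translation t."""
    R_total, t_total = np.eye(2), np.zeros(2)
    cur, prev_err = src.copy(), np.inf
    for _ in range(max_iters):
        # brute-force nearest-neighbour correspondences
        d = np.linalg.norm(cur[:, None] - dst[None], axis=2)
        idx = d.argmin(axis=1)
        R, t = best_fit_transform(cur, dst[idx])
        cur = cur @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d[np.arange(len(cur)), idx].mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```

In the paper's setting the exchanged clouds are 3-D and deliberately small (to save bandwidth), and a coarse 2D map supplies the initial guess; plain ICP like this converges reliably only when the initial misalignment is small relative to point spacing, which is exactly why such an initialization matters.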