{"title":"基于通用优化的多传感器全局姿态估计框架","authors":"Tong Qin, Shaozu Cao, Jie Pan, Shaojie Shen","doi":"10.1049/csy2.70023","DOIUrl":null,"url":null,"abstract":"<p>Accurate state estimation is a fundamental problem for autonomous robots. To achieve locally accurate and globally drift-free state estimation, multiple sensors with complementary properties are usually fused together. Local sensors (camera, IMU (inertial measurement unit), LiDAR, etc.) provide precise poses within a small region, whereas global sensors (GPS (global positioning system), magnetometer, barometer, etc.) supply noisy but globally drift-free localisation in a large-scale environment. In this paper, we propose a sensor fusion framework to fuse local states with global sensors, which achieves locally accurate and globally drift-free pose estimation. Local estimations, produced by existing visual odometry/visual-inertial odometry (VO/VIO) approaches, are fused with global sensors in a pose graph optimisation. Within the graph optimisation, local estimations are aligned into a global coordinate. Meanwhile, the accumulated drifts are eliminated. We evaluated the performance of our system on public datasets and with real-world experiments. The results are compared with those of other state-of-the-art algorithms. We highlight that our system is a general framework which can easily fuse various global sensors in a unified pose graph optimisation.</p>","PeriodicalId":34110,"journal":{"name":"IET Cybersystems and Robotics","volume":"7 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2025-09-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/csy2.70023","citationCount":"0","resultStr":"{\"title\":\"A General Optimisation-Based Framework for Global Pose Estimation With Multiple Sensors\",\"authors\":\"Tong Qin, Shaozu Cao, Jie Pan, Shaojie Shen\",\"doi\":\"10.1049/csy2.70023\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Accurate state estimation is a fundamental problem for autonomous robots. To achieve locally accurate and globally drift-free state estimation, multiple sensors with complementary properties are usually fused together. Local sensors (camera, IMU (inertial measurement unit), LiDAR, etc.) provide precise poses within a small region, whereas global sensors (GPS (global positioning system), magnetometer, barometer, etc.) supply noisy but globally drift-free localisation in a large-scale environment. In this paper, we propose a sensor fusion framework to fuse local states with global sensors, which achieves locally accurate and globally drift-free pose estimation. Local estimations, produced by existing visual odometry/visual-inertial odometry (VO/VIO) approaches, are fused with global sensors in a pose graph optimisation. Within the graph optimisation, local estimations are aligned into a global coordinate. Meanwhile, the accumulated drifts are eliminated. We evaluated the performance of our system on public datasets and with real-world experiments. The results are compared with those of other state-of-the-art algorithms. 
We highlight that our system is a general framework which can easily fuse various global sensors in a unified pose graph optimisation.</p>\",\"PeriodicalId\":34110,\"journal\":{\"name\":\"IET Cybersystems and Robotics\",\"volume\":\"7 1\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2025-09-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ietresearch.onlinelibrary.wiley.com/doi/epdf/10.1049/csy2.70023\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IET Cybersystems and Robotics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/csy2.70023\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IET Cybersystems and Robotics","FirstCategoryId":"1085","ListUrlMain":"https://ietresearch.onlinelibrary.wiley.com/doi/10.1049/csy2.70023","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
A General Optimisation-Based Framework for Global Pose Estimation With Multiple Sensors
Accurate state estimation is a fundamental problem for autonomous robots. To achieve locally accurate and globally drift-free state estimation, multiple sensors with complementary properties are usually fused together. Local sensors (camera, IMU (inertial measurement unit), LiDAR, etc.) provide precise poses within a small region, whereas global sensors (GPS (global positioning system), magnetometer, barometer, etc.) supply noisy but globally drift-free localisation in large-scale environments. In this paper, we propose a sensor fusion framework that fuses local state estimates with global sensors to achieve locally accurate and globally drift-free pose estimation. Local estimates, produced by existing visual odometry/visual-inertial odometry (VO/VIO) approaches, are fused with global sensor measurements in a pose graph optimisation. Within the graph optimisation, local estimates are aligned to a global coordinate frame and the accumulated drift is eliminated. We evaluate the performance of our system on public datasets and in real-world experiments, and compare the results with those of other state-of-the-art algorithms. We highlight that our system is a general framework which can easily fuse various global sensors in a unified pose graph optimisation.
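The fusion described in the abstract can be viewed as a nonlinear least-squares problem over the robot's poses, with relative-pose factors from local VO/VIO odometry and absolute-position factors from global sensors such as GPS. The following is a minimal, illustrative 2D sketch of that idea (not the authors' implementation); the pose count, measurement values, and noise levels are assumptions chosen only to show how the two kinds of factors enter a single optimisation.

```python
# Illustrative sketch only: a tiny 2D pose graph that fuses local odometry
# constraints (accurate but drifting) with global position measurements
# (noisy but drift-free), solved as nonlinear least squares with SciPy.
# All measurement values and noise levels below are assumed for illustration.
import numpy as np
from scipy.optimize import least_squares

N = 5                                          # number of poses, each (x, y, yaw)
odom = [(1.0, 0.0, 0.1)] * (N - 1)             # assumed relative motion between consecutive poses
gps = {0: (0.0, 0.0), 4: (3.8, 1.1)}           # assumed sparse global position fixes (pose index -> x, y)
odom_sigma, gps_sigma = 0.05, 0.5              # assumed noise levels

def residuals(x):
    poses = x.reshape(N, 3)
    res = []
    # Local factors: predicted relative motion between consecutive poses vs. measured odometry.
    for i, (dx, dy, dth) in enumerate(odom):
        xi, yi, thi = poses[i]
        xj, yj, thj = poses[i + 1]
        c, s = np.cos(thi), np.sin(thi)
        # Express the step from pose i to pose i+1 in the frame of pose i.
        rel = np.array([ c * (xj - xi) + s * (yj - yi),
                        -s * (xj - xi) + c * (yj - yi),
                         thj - thi])
        res.extend((rel - np.array([dx, dy, dth])) / odom_sigma)
    # Global factors: estimated position vs. GPS-like measurement, already in the global frame.
    for i, (gx, gy) in gps.items():
        res.extend((poses[i, :2] - np.array([gx, gy])) / gps_sigma)
    return np.array(res)

x0 = np.zeros(3 * N)                           # in practice the local odometry would seed this guess
sol = least_squares(residuals, x0)
print(sol.x.reshape(N, 3))                     # poses aligned to the global frame, drift pulled out by GPS factors
```

In this toy setting the odometry factors keep consecutive poses locally consistent while the two global factors anchor the whole trajectory in the global frame, which is the same division of labour between local and global sensors that the paper's pose graph exploits.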