Using the marginalised particle filter for real-time visual-inertial sensor fusion
G. Bleser, D. Stricker
2008 7th IEEE/ACM International Symposium on Mixed and Augmented Reality, 15 September 2008
DOI: 10.1109/ISMAR.2008.4637316
The use of a particle filter (PF) for camera pose estimation is an ongoing topic in the robotics and computer vision communities, especially since the FastSLAM algorithm was adopted for simultaneous localisation and mapping (SLAM) with a single camera. The major problem in this context is the poor proposal distribution of the camera pose particles, which results from the weak motion model of a camera moved freely in 3D space. While the FastSLAM 2.0 extension is one way to improve the proposal distribution, this paper addresses the question of how measurements from low-cost inertial sensors (gyroscopes and accelerometers) can be used to compensate for the missing control information. However, integrating inertial data requires the additional estimation of sensor biases, velocities and potentially accelerations, resulting in a state dimension that a standard PF cannot handle. The contribution of this paper therefore consists in a real-time capable sensor fusion strategy based upon the marginalised particle filter (MPF) framework. The performance of the proposed strategy is evaluated in combination with a marker-based tracking system, and results are presented from a comparison with previous visual-inertial fusion strategies based upon the extended Kalman filter (EKF), the standard PF and the MPF.
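The key idea behind the MPF (also known as the Rao-Blackwellised particle filter) is that, conditioned on the sampled nonlinear states, the remaining states, here the inertial sensor biases, are linear-Gaussian and can be tracked by a per-particle Kalman filter instead of being sampled, which keeps the particle dimension small. A minimal sketch of this structure for a toy 1-D model, with an angle driven by a biased gyroscope rate and corrected by a vision-like measurement; all model parameters, noise levels and function names are illustrative assumptions, not values or code from the paper:

```python
import numpy as np

def mpf_step(theta, m, P, w, omega, y, dt,
             Q_theta=1e-4, Q_b=1e-6, R_v=1e-2, rng=None):
    """One marginalised-PF step for the toy model
        theta' = theta + (omega - b)*dt + w_theta   (gyro integration)
        b'     = b + w_b                            (bias random walk)
        y      = theta' + v                         (vision measurement).
    The bias b is linear-Gaussian given the sampled angle trajectory,
    so each particle carries a Kalman mean/variance (m, P) for b."""
    rng = rng if rng is not None else np.random.default_rng(0)
    N = theta.size
    # 1) Sample the nonlinear state from its marginal predictive: the
    #    bias uncertainty P is folded into the proposal variance.
    S = dt**2 * P + Q_theta
    theta_new = theta + (omega - m) * dt + rng.normal(0.0, np.sqrt(S))
    # 2) Kalman measurement update of the bias from the pseudo-measurement
    #    z = theta_new - theta - omega*dt = -b*dt + w_theta  (H = -dt).
    z = theta_new - theta - omega * dt
    H = -dt
    K = P * H / S
    m = m + K * (z - H * m)
    P = (1.0 - K * H) * P + Q_b        # update + random-walk time update
    # 3) Weight the particles with the vision measurement.
    w = w * np.exp(-0.5 * (y - theta_new) ** 2 / R_v)
    w = w / w.sum()
    # 4) Resample when the effective sample size collapses.
    if 1.0 / np.sum(w**2) < N / 2:
        idx = rng.choice(N, size=N, p=w)
        theta_new, m, P = theta_new[idx], m[idx], P[idx]
        w = np.full(N, 1.0 / N)
    return theta_new, m, P, w

# Usage: a biased gyro (true bias 0.3 rad/s) integrated at 100 Hz,
# with a noisy absolute measurement standing in for the vision system.
rng = np.random.default_rng(42)
N, dt = 500, 0.01
theta = np.zeros(N)
m, P = np.zeros(N), np.full(N, 1.0)      # per-particle bias KF
w = np.full(N, 1.0 / N)
true_theta, true_b = 0.0, 0.3
for _ in range(400):
    true_theta += 1.0 * dt
    omega_meas = 1.0 + true_b + rng.normal(0.0, 0.01)
    y = true_theta + rng.normal(0.0, 0.1)
    theta, m, P, w = mpf_step(theta, m, P, w, omega_meas, y, dt, rng=rng)
theta_est = np.sum(w * theta)
bias_est = np.sum(w * m)
```

Note that the bias never enters the particle state itself: it is updated analytically in step 2 from the sampled angle transition, which is exactly the dimension reduction that makes the high-dimensional visual-inertial state manageable in real time.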