An Improved Tightly-Coupled Monocular Visual-Inertial Location Algorithm
Yiliang Wu, Minghong Chen, Yendo Hu, Xue Bai, Zhipeng Shi, Guanglei Zhuo
Proceedings of the 2021 5th International Conference on Electronic Information Technology and Computer Engineering, October 22, 2021
DOI: 10.1145/3501409.3501694
Both direct and optical-flow visual odometry methods rest on the strong assumption that image gray levels are invariant, which makes them sensitive to illumination changes. To address this problem, this paper proposes a monocular visual-inertial odometry that combines multilevel Oriented FAST and Rotated BRIEF (ORB) features with a tightly-coupled fusion strategy, aiming to improve the speed and robustness of feature matching. The method also builds a high-precision initialization map, which ensures successful initialization of the whole system and smooth operation of subsequent tracking. Inertial Measurement Unit (IMU) data are pre-integrated and combined with visual reprojection errors to form the constraints of the optimization problem. Experiments on public datasets demonstrate that the multilevel ORB features and the IMU fusion make the algorithm more accurate than comparable systems. In particular, compared with OKVIS and VINS-Mono, which also fuse visual and inertial measurements, the algorithm achieves better state-estimation accuracy and system robustness.
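As context for the pre-integration step named in the abstract: IMU pre-integration accumulates the relative rotation, velocity, and position between two keyframes in the body frame of the first, so the result can be reused as a constraint without re-integrating when the optimizer updates the keyframe states. The sketch below is an illustrative minimal version under simplifying assumptions (simple Euler integration; gyroscope/accelerometer biases, noise, and gravity compensation are ignored), not the paper's implementation; all function names are hypothetical.

```python
import math

def mat_mul(A, B):
    """3x3 matrix product."""
    return [[sum(A[i][k] * B[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def mat_vec(A, v):
    """3x3 matrix times 3-vector."""
    return [sum(A[i][k] * v[k] for k in range(3)) for i in range(3)]

def exp_so3(w, dt):
    """Rotation matrix for constant angular velocity w (rad/s) applied for
    dt seconds, via the Rodrigues formula."""
    theta = math.sqrt(sum(c * c for c in w)) * dt
    I = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    if theta < 1e-12:
        return I
    k = [c * dt / theta for c in w]          # unit rotation axis
    K = [[0.0, -k[2], k[1]],
         [k[2], 0.0, -k[0]],
         [-k[1], k[0], 0.0]]                 # skew-symmetric matrix of k
    K2 = mat_mul(K, K)
    s, c = math.sin(theta), math.cos(theta)
    return [[I[i][j] + s * K[i][j] + (1.0 - c) * K2[i][j]
             for j in range(3)] for i in range(3)]

def preintegrate(samples, dt):
    """Accumulate relative rotation dR, velocity dv, and position dp in the
    body frame of the first keyframe.  `samples` is a list of
    (gyro, accel) tuples, assumed here to be bias-corrected and
    gravity-compensated."""
    dR = [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]]
    dv = [0.0, 0.0, 0.0]
    dp = [0.0, 0.0, 0.0]
    for gyro, accel in samples:
        a = mat_vec(dR, accel)               # acceleration rotated into frame i
        dp = [dp[k] + dv[k] * dt + 0.5 * a[k] * dt * dt for k in range(3)]
        dv = [dv[k] + a[k] * dt for k in range(3)]
        dR = mat_mul(dR, exp_so3(gyro, dt))
    return dR, dv, dp
```

For example, a constant 1 m/s² acceleration along x for 1 s (100 samples at 100 Hz, zero rotation) yields dv ≈ [1, 0, 0] m/s and dp ≈ [0.5, 0, 0] m, matching the closed-form kinematics. In a tightly-coupled system, the residual between these pre-integrated increments and the increments predicted by the keyframe states would be stacked with the visual reprojection errors in one joint optimization.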