An Improved Tightly-Coupled Monocular Visual-Inertial Location Algorithm

Yiliang Wu, Minghong Chen, Yendo Hu, Xue Bai, Zhipeng Shi, Guanglei Zhuo
{"title":"一种改进的紧耦合单目视觉惯性定位算法","authors":"Yiliang Wu, Minghong Chen, Yendo Hu, Xue Bai, Zhipeng Shi, Guanglei Zhuo","doi":"10.1145/3501409.3501694","DOIUrl":null,"url":null,"abstract":"Both direct and optical flow visual odometers are based on a strong assumption that the gray scale is invariant. Because of this assumption, the system is sensitive to the luminosity change of the image. To solve this problem, this paper proposes monocular visual-inertial odometry exploiting both multilevel Oriented Fast and Rotated Brief (ORB) feature and tightly-coupled fusion strategy. The purpose of this method is to improve the speed and robustness of matching. Furthermore, it also can build a high-precision initialization map to ensure the successful initialization of the whole system and the smooth operation of the subsequent. This paper pre-integrates Inertial Measurement Unit (IMU) data and constructs constraints with visual reprojection to optimize the solution. The experiments evaluated on public datasets demonstrate the multilevel ORB feature and the IMU fusion make the algorithm more accurate than other excellent projects. Compared with OKVIS and VINS-Mono using the visual and inertial fusion, the algorithm also performs better in the accuracy of state estimation and system robustness.","PeriodicalId":191106,"journal":{"name":"Proceedings of the 2021 5th International Conference on Electronic Information Technology and Computer Engineering","volume":"33 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"An Improved Tightly-Coupled Monocular Visual-Inertial Location Algorithm\",\"authors\":\"Yiliang Wu, Minghong Chen, Yendo Hu, Xue Bai, Zhipeng Shi, Guanglei Zhuo\",\"doi\":\"10.1145/3501409.3501694\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Both direct and optical flow visual odometers are based on a strong assumption that the gray scale is invariant. Because of this assumption, the system is sensitive to the luminosity change of the image. To solve this problem, this paper proposes monocular visual-inertial odometry exploiting both multilevel Oriented Fast and Rotated Brief (ORB) feature and tightly-coupled fusion strategy. The purpose of this method is to improve the speed and robustness of matching. Furthermore, it also can build a high-precision initialization map to ensure the successful initialization of the whole system and the smooth operation of the subsequent. This paper pre-integrates Inertial Measurement Unit (IMU) data and constructs constraints with visual reprojection to optimize the solution. The experiments evaluated on public datasets demonstrate the multilevel ORB feature and the IMU fusion make the algorithm more accurate than other excellent projects. 
Compared with OKVIS and VINS-Mono using the visual and inertial fusion, the algorithm also performs better in the accuracy of state estimation and system robustness.\",\"PeriodicalId\":191106,\"journal\":{\"name\":\"Proceedings of the 2021 5th International Conference on Electronic Information Technology and Computer Engineering\",\"volume\":\"33 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-10-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2021 5th International Conference on Electronic Information Technology and Computer Engineering\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3501409.3501694\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2021 5th International Conference on Electronic Information Technology and Computer Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3501409.3501694","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Both direct and optical-flow-based visual odometry rely on the strong assumption that image gray levels remain constant, which makes such systems sensitive to changes in image luminosity. To address this problem, this paper proposes a monocular visual-inertial odometry that exploits multilevel Oriented FAST and Rotated BRIEF (ORB) features together with a tightly-coupled fusion strategy, with the aim of improving the speed and robustness of feature matching. The method also builds a high-precision initialization map to ensure that the whole system initializes successfully and runs smoothly afterwards. Inertial Measurement Unit (IMU) data are pre-integrated, and the resulting constraints are combined with visual reprojection errors in the optimization. Experiments on public datasets demonstrate that the multilevel ORB features and the IMU fusion make the algorithm more accurate than other state-of-the-art systems: compared with OKVIS and VINS-Mono, which likewise fuse visual and inertial measurements, it achieves better state-estimation accuracy and system robustness.
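
The front end described above extracts multilevel ORB features to speed up and robustify matching. The following is a minimal sketch using OpenCV's ORB implementation; the pyramid parameters, ratio-test threshold, and frame file names are illustrative assumptions, not values taken from the paper.

```python
import cv2

def extract_multilevel_orb(image, n_features=1000, n_levels=8, scale_factor=1.2):
    """Detect ORB keypoints over an image pyramid and compute binary descriptors."""
    orb = cv2.ORB_create(nfeatures=n_features,
                         scaleFactor=scale_factor,  # pyramid decimation ratio
                         nlevels=n_levels)          # number of pyramid levels
    keypoints, descriptors = orb.detectAndCompute(image, None)
    return keypoints, descriptors

def match_orb(desc_prev, desc_curr, ratio=0.75):
    """Match binary descriptors with Hamming distance and a ratio test."""
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING)
    good = []
    for pair in matcher.knnMatch(desc_prev, desc_curr, k=2):
        if len(pair) == 2 and pair[0].distance < ratio * pair[1].distance:
            good.append(pair[0])
    return good

if __name__ == "__main__":
    # Placeholder frame paths; replace with two consecutive camera images.
    img0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
    img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
    kp0, des0 = extract_multilevel_orb(img0)
    kp1, des1 = extract_multilevel_orb(img1)
    matches = match_orb(des0, des1)
    print(f"{len(matches)} putative matches across pyramid levels")
```

In a tightly-coupled pipeline, the surviving matches supply the visual reprojection constraints that are later fused with the IMU pre-integration terms (see the next sketch).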
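The tightly-coupled back end pre-integrates IMU measurements between consecutive image frames and fuses the result with the visual reprojection constraints. The snippet below is a minimal numpy sketch of the standard pre-integration recursion (bias correction and noise propagation omitted); the function names, the constant sample interval dt, and the data layout are illustrative assumptions, not taken from the paper.

```python
import numpy as np

def so3_exp(phi):
    """Rodrigues' formula: map a rotation vector to a rotation matrix."""
    theta = np.linalg.norm(phi)
    if theta < 1e-9:
        return np.eye(3)
    k = phi / theta
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def preintegrate_imu(gyro, accel, dt):
    """Accumulate relative rotation, velocity, and position between two frames
    from raw IMU samples. Gravity compensation and bias correction are applied
    later, when these terms are turned into residuals against the state."""
    dR = np.eye(3)      # relative rotation since the last keyframe
    dv = np.zeros(3)    # relative velocity increment
    dp = np.zeros(3)    # relative position increment
    for w, a in zip(gyro, accel):
        dp += dv * dt + 0.5 * (dR @ a) * dt**2
        dv += (dR @ a) * dt
        dR = dR @ so3_exp(w * dt)
    return dR, dv, dp
```

In a full system of the kind described in the abstract, the pre-integrated terms dR, dv, dp define inertial residuals that are minimized jointly with the reprojection errors of the matched ORB features.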