VIRAA-SLAM: Flexible Robust Visual-Inertial-Range-AOA Tightly-Coupled Localization

Impact Factor: 5.3 · CAS Region 2 (Computer Science) · JCR Q2 (Robotics)
Xingyu Ma;Ningyan Guo;Rui Xin;Zhigang Cen;Zhiyong Feng
DOI: 10.1109/LRA.2025.3606384
Journal: IEEE Robotics and Automation Letters, vol. 10, no. 10, pp. 10658–10665
Published: 2025-09-04
URL: https://ieeexplore.ieee.org/document/11151655/
Citations: 0

Abstract

In this letter, we propose a novel tightly-coupled fusion framework for robust and accurate long-term localization in fast-motion scenarios, integrating a monocular camera, a 6-DoF inertial measurement unit (IMU), and multiple position-unknown ultra-wideband (UWB) anchors. Unlike existing UWB fusion methods that rely on pre-calibrated anchors' positions, our approach leverages the relative UWB-derived angle and ranging measurements to constrain relative frame-to-frame relationships within a sliding window. These constraints are converted into priors through marginalization, significantly simplifying system complexity and the fusion process. Crucially, our method eliminates the need for the anchors' location estimations, supports an arbitrary number of anchors, and maintains robustness even under prolonged visual degradation. Experimental validation includes a challenging scenario where visual data is discarded between 15–60 seconds, demonstrating sustained operation without vision. Accuracy evaluations confirm that our method achieves superior performance compared to VINS-Mono, highlighting its precision and resilience in dynamic environments.
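The abstract's key idea is that range and angle-of-arrival (AOA) measurements to a static anchor of unknown position can constrain the relative pose between two frames: each measurement places the anchor at a point in that frame's body coordinates, and mapping both points into the world frame, a fixed anchor forces the two estimates to coincide. The sketch below illustrates this residual under simplifying assumptions (azimuth/elevation AOA parameterization, noise-free geometry, no IMU terms); all function names are hypothetical and the paper's actual factor formulation may differ.

```python
import numpy as np

def unit_dir(azimuth, elevation):
    """Unit bearing vector in the tag's body frame from AOA (radians)."""
    ca, sa = np.cos(azimuth), np.sin(azimuth)
    ce, se = np.cos(elevation), np.sin(elevation)
    return np.array([ce * ca, ce * sa, se])

def frame_to_frame_residual(R_wi, t_wi, R_wj, t_wj, meas_i, meas_j):
    """Residual tying poses i and j through one static, position-unknown anchor.

    meas_k = (range, azimuth, elevation) measured at frame k.
    Each measurement locates the anchor at range * unit_dir in body frame k;
    expressing both in the world frame, a static anchor makes the difference
    vanish, constraining the relative pose without ever estimating the
    anchor's world position explicitly.
    """
    d_i, az_i, el_i = meas_i
    d_j, az_j, el_j = meas_j
    anchor_from_i = R_wi @ (d_i * unit_dir(az_i, el_i)) + t_wi
    anchor_from_j = R_wj @ (d_j * unit_dir(az_j, el_j)) + t_wj
    return anchor_from_i - anchor_from_j  # ~0 when poses agree with measurements
```

In a sliding-window estimator, one such 3-vector residual per anchor per frame pair would be stacked alongside visual and IMU factors; because the anchor state never enters the optimization, any number of anchors can be added without growing the state vector, consistent with the "arbitrary number of anchors" claim above.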
Source journal
IEEE Robotics and Automation Letters — Computer Science: Computer Science Applications
CiteScore: 9.60
Self-citation rate: 15.40%
Articles published: 1428
Journal description: The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.