VIRAA-SLAM: Flexible Robust Visual-Inertial-Range-AOA Tightly-Coupled Localization
Authors: Xingyu Ma; Ningyan Guo; Rui Xin; Zhigang Cen; Zhiyong Feng
DOI: 10.1109/LRA.2025.3606384
Journal: IEEE Robotics and Automation Letters, vol. 10, no. 10, pp. 10658-10665 (JCR Q2, Robotics; Impact Factor 5.3)
Published: 2025-09-04 (Journal Article)
URL: https://ieeexplore.ieee.org/document/11151655/
Citations: 0
Abstract
In this letter, we propose a novel tightly-coupled fusion framework for robust and accurate long-term localization in fast-motion scenarios, integrating a monocular camera, a 6-DoF inertial measurement unit (IMU), and multiple ultra-wideband (UWB) anchors whose positions are unknown. Unlike existing UWB fusion methods that rely on pre-calibrated anchor positions, our approach leverages relative UWB-derived angle and ranging measurements to constrain frame-to-frame relationships within a sliding window. These constraints are converted into priors through marginalization, significantly reducing system complexity and simplifying the fusion process. Crucially, our method eliminates the need to estimate anchor locations, supports an arbitrary number of anchors, and maintains robustness even under prolonged visual degradation. Experimental validation includes a challenging scenario in which visual data is discarded between 15 and 60 seconds, demonstrating sustained operation without vision. Accuracy evaluations confirm that our method outperforms VINS-Mono, highlighting its precision and resilience in dynamic environments.
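To illustrate the anchor-position-free idea described in the abstract, the sketch below shows one plausible form of a frame-to-frame residual: combining a range d and a unit bearing vector u (from AOA) gives the anchor's position in each camera frame's coordinates, and requiring the two observations to agree under the relative pose (R_ij, t_ij) eliminates the anchor's global position entirely. This is a minimal, hypothetical reconstruction for intuition only; the function name, conventions (p^i = R_ij p^j + t_ij), and noise-free form are assumptions, not the paper's actual formulation, which additionally involves IMU preintegration, visual factors, and marginalization.

```python
import numpy as np

def uwb_frame_to_frame_residual(R_ij, t_ij, d_i, u_i, d_j, u_j):
    """Hypothetical UWB range/AOA frame-to-frame residual (illustrative only).

    R_ij, t_ij : relative pose mapping points from frame j into frame i,
                 i.e. p^i = R_ij @ p^j + t_ij.
    d_i, u_i   : range and unit bearing to the same anchor, seen from frame i.
    d_j, u_j   : range and unit bearing to that anchor, seen from frame j.

    The anchor position in each frame is simply range * bearing; transforming
    the frame-j observation into frame i and subtracting the frame-i
    observation yields a 3-vector residual that never references the anchor's
    (unknown) global coordinates.
    """
    p_anchor_in_i = d_i * np.asarray(u_i)          # anchor as seen from frame i
    p_anchor_in_j = d_j * np.asarray(u_j)          # anchor as seen from frame j
    return R_ij @ p_anchor_in_j + t_ij - p_anchor_in_i
```

With perfect measurements the residual is exactly zero, which is what makes it usable as a sliding-window constraint between poses; in practice it would be weighted by measurement covariance and minimized jointly with the other factors.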
Journal Introduction:
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.