MUSE: A Real-Time Multi-Sensor State Estimator for Quadruped Robots

Ylenia Nisticò; João Carlos Virgolino Soares; Lorenzo Amatucci; Geoff Fink; Claudio Semini

IEEE Robotics and Automation Letters, vol. 10, no. 5, pp. 4620-4627
DOI: 10.1109/LRA.2025.3553047
Published: 2025-03-19
https://ieeexplore.ieee.org/document/10933515/
Citations: 0
Abstract
This letter introduces an innovative state estimator, MUSE (MUlti-sensor State Estimator), designed to enhance the accuracy and real-time performance of state estimation in quadruped robot navigation. The proposed state estimator builds upon our previous work presented in (Fink et al. 2020). It integrates data from a range of onboard sensors, including IMUs, encoders, cameras, and LiDARs, to deliver a comprehensive and reliable estimation of the robot's pose and motion, even in slippery scenarios. We tested MUSE on a Unitree Aliengo robot, successfully closing the locomotion control loop in difficult scenarios, including slippery and uneven terrain. Benchmarking against Pronto (Camurri et al. 2020) and VILENS (Wisth et al. 2022) showed 67.6% and 26.7% reductions in translational errors, respectively. Additionally, MUSE outperformed DLIO (Chen et al. 2023), a LiDAR-inertial odometry system, in rotational errors and frequency, while the proprioceptive version of MUSE (P-MUSE) outperformed TSIF (Bloesch et al. 2018), with a 45.9% reduction in absolute trajectory error (ATE).
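The abstract reports improvements in terms of absolute trajectory error (ATE) and percentage error reductions over baselines. As a minimal sketch (not the authors' evaluation code; function names are illustrative, and trajectory alignment and timestamp association are omitted), the RMSE form of ATE and the relative-reduction figures are typically computed as follows:

```python
import numpy as np

def ate_rmse(est, gt):
    """Root-mean-square absolute trajectory error between two
    time-aligned position trajectories, each of shape (N, 3)."""
    err = np.linalg.norm(est - gt, axis=1)  # per-pose Euclidean error
    return np.sqrt(np.mean(err ** 2))

def percent_reduction(baseline_err, our_err):
    """Relative error reduction of one method over a baseline, in percent."""
    return 100.0 * (baseline_err - our_err) / baseline_err

# Illustrative usage with a toy 2-pose trajectory offset by 0.1 m in x:
gt = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
est = gt + np.array([0.1, 0.0, 0.0])
print(ate_rmse(est, gt))            # 0.1 m for a constant 0.1 m offset
print(percent_reduction(2.0, 1.0))  # 50.0 (% reduction)
```

A "45.9% reduction in ATE" as reported for P-MUSE versus TSIF corresponds to `percent_reduction(ate_baseline, ate_ours)` evaluated on the same ground-truth trajectory.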
Journal Description
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.