Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow
Liam Boyle, Jonas Kühne, Nicolas Baumann, Niklas Bastuck, Michele Magno
IEEE Robotics and Automation Letters, vol. 10, no. 7, pp. 7318-7325, 2025. DOI: 10.1109/LRA.2025.3576070
Citations: 0
Abstract
Accurate velocity estimation is critical in mobile robotics, particularly for driver assistance systems and autonomous driving. Wheel odometry fused with Inertial Measurement Unit (IMU) data is a widely used method for velocity estimation; however, it typically requires strong assumptions, such as non-slip steering, or complex vehicle dynamics models that do not hold under varying environmental conditions, such as slippery surfaces. We introduce an approach to velocity estimation that is decoupled from wheel-to-surface traction assumptions by leveraging planar kinematics in combination with optical flow from event cameras pointed perpendicularly at the ground. The asynchronous microsecond latency and high dynamic range of event cameras make them highly robust to motion blur, a common challenge in vision-based perception for autonomous driving. The proposed method is evaluated through in-field experiments on a 1:10-scale autonomous racing platform and compared against precise motion-capture data, demonstrating not only performance on par with the state-of-the-art Event-VIO method but also a 38.3% improvement in lateral error. Qualitative experiments at highway speeds of up to 32 m/s further confirm the effectiveness of our approach, indicating significant potential for real-world deployment.
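The geometric core of the approach described in the abstract can be illustrated with a minimal sketch (not the paper's implementation): for a pinhole camera pointed perpendicularly at a flat ground plane at a known, fixed height, image flow near the principal point maps linearly to metric planar velocity. The function name, parameters, and sign convention below are illustrative assumptions.

    import numpy as np

    def planar_velocity_from_flow(mean_flow_px_s, cam_height_m, focal_px):
        """Estimate planar robot velocity from ground-plane optical flow.

        Sketch assuming a pinhole camera aimed straight down at a flat
        ground plane at known height, with flow evaluated near the
        principal point (where pure yaw induces no translational flow).
        """
        # Similar triangles: one pixel of image motion corresponds to
        # (height / focal_length) metres of ground motion.
        metres_per_px = cam_height_m / focal_px
        # The ground appears to move opposite to the robot, hence the sign
        # flip (the actual convention depends on camera mounting and axes).
        return -metres_per_px * np.asarray(mean_flow_px_s, dtype=float)

For instance, with the camera 0.1 m above the ground and a focal length of 500 px, a measured flow of -200 px/s along the image x-axis corresponds to roughly 0.04 m/s of forward motion under this convention.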
Journal Scope:
This journal publishes peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in robotics and automation.