Planar Velocity Estimation for Fast-Moving Mobile Robots Using Event-Based Optical Flow

Impact Factor 4.6 · CAS Tier 2 (Computer Science) · JCR Q2 (Robotics)
Liam Boyle, Jonas Kühne, Nicolas Baumann, Niklas Bastuck, Michele Magno
DOI: 10.1109/LRA.2025.3576070
Journal: IEEE Robotics and Automation Letters, vol. 10, no. 7, pp. 7318–7325
Published: 2025-06-02
URL: https://ieeexplore.ieee.org/document/11021382/
Citations: 0

Abstract

Accurate velocity estimation is critical in mobile robotics, particularly for driver assistance systems and autonomous driving. Wheel odometry fused with Inertial Measurement Unit (IMU) data is a widely used method for velocity estimation; however, it typically requires strong assumptions, such as non-slip steering, or complex vehicle dynamics models that do not hold under varying environmental conditions, such as slippery surfaces. We introduce an approach to velocity estimation that is decoupled from wheel-to-surface traction assumptions by leveraging planar kinematics in combination with optical flow from event cameras pointed perpendicularly at the ground. The asynchronous, microsecond-level ($\mu$s) latency and high dynamic range of event cameras make them highly robust to motion blur, a common challenge in vision-based perception techniques for autonomous driving. The proposed method is evaluated through in-field experiments on a 1:10 scale autonomous racing platform and compared against precise motion capture data, demonstrating not only performance on par with the state-of-the-art Event-VIO method but also a 38.3% improvement in lateral error. Qualitative experiments at highway speeds of up to ${32}\,{\text{m}}/{\text{s}}$ further confirm the effectiveness of our approach, indicating significant potential for real-world deployment.
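The paper's full pipeline is not reproduced here, but the core geometric idea the abstract describes — that a ground-facing camera at known height turns image-plane optical flow into metric planar velocity under a pinhole model — can be sketched as follows. The function name, focal length, and camera height are illustrative assumptions, not values from the paper:

```python
import numpy as np

def flow_to_ground_velocity(flow_px, focal_px, height_m):
    """Map image-plane optical flow (pixels/s) to metric ground-plane
    velocity (m/s) for a camera pointed perpendicularly at the ground.

    Under the planar assumption, every imaged ground point lies at depth
    `height_m`, so the pinhole projection scale is focal_px / height_m
    and metric velocity is pixel velocity divided by that scale.
    """
    flow_px = np.asarray(flow_px, dtype=float)
    return flow_px * (height_m / focal_px)

# Illustrative numbers: 200 px/s of flow, 500 px focal length,
# camera mounted 0.05 m above the ground.
v = flow_to_ground_velocity([200.0, -100.0], focal_px=500.0, height_m=0.05)
# v = [0.02, -0.01] m/s in the camera frame (sign convention illustrative)
```

In practice the flow itself would come from an event-based optical-flow estimator rather than a frame-based one, and the camera-frame velocity would still need to be rotated into the vehicle body frame via the mounting extrinsics.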
Source Journal

IEEE Robotics and Automation Letters (Computer Science — Computer Science Applications)
CiteScore: 9.60
Self-citation rate: 15.40%
Articles per year: 1428
Journal description: The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.