A fast probabilistic ego-motion estimation framework for radar

M. Rapp, M. Barjenbruch, K. Dietmayer, Markus Hahn, J. Dickmann
2015 European Conference on Mobile Robots (ECMR)
DOI: 10.1109/ECMR.2015.7324046
Published: 2015-11-12
Citations: 27

Abstract

This paper presents a fast, probabilistic, joint spatial- and Doppler-velocity-based approach to ego-motion estimation for single and multiple radar-equipped robots. The normal distributions transform (NDT) is used for fast and accurate position matching of consecutive radar detections; this registration technique has previously been applied successfully to laser-based scan matching. To overcome discontinuities in the original normal-distributions approach, an appropriate clustering technique provides a globally smooth Gaussian-mixture representation. It is shown how this matching approach can be significantly improved by taking the Doppler information into account: the Doppler measurements are used in a density-based approach to extend the position matching to a joint likelihood function, and the estimated ego-motion is the one that maximizes this function. Large-scale real-world experiments in an urban environment using a 77 GHz radar demonstrate the robust and accurate ego-motion estimation of the proposed algorithm. In the experiments, comparisons are made to state-of-the-art algorithms, the vehicle odometry, and a high-precision inertial measurement unit.
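The two ingredients of the joint likelihood described above can be sketched roughly as follows: an NDT-style position score (a Gaussian fitted per occupied grid cell of the reference scan, evaluated at the transformed current-scan points) plus a Doppler term that scores how well measured radial velocities agree with a hypothesized ego velocity under a stationary-world assumption. This is an illustrative reimplementation, not the authors' code: the cell size, the covariance regularization, the Doppler noise `sigma_v`, and the simple per-point sum of likelihoods are all assumptions for the sketch (the paper uses a smoothed mixed-Gaussian representation rather than a hard grid).

```python
import numpy as np

def build_ndt_cells(points, cell_size=2.0):
    """Fit one 2-D Gaussian (mean, inverse covariance) per occupied grid cell
    of the reference scan, as in NDT-style registration."""
    cells = {}
    for p in points:
        key = tuple(np.floor(p / cell_size).astype(int))
        cells.setdefault(key, []).append(p)
    ndt = {}
    for key, pts in cells.items():
        pts = np.asarray(pts)
        if len(pts) < 3:                      # too few samples for a covariance
            continue
        mu = pts.mean(axis=0)
        cov = np.cov(pts.T) + 1e-3 * np.eye(2)  # regularize near-singular cells
        ndt[key] = (mu, np.linalg.inv(cov))
    return ndt

def ndt_score(points, ndt, cell_size=2.0):
    """Sum of Gaussian likelihoods of current-scan points under the NDT map;
    higher means better spatial alignment."""
    score = 0.0
    for p in points:
        key = tuple(np.floor(p / cell_size).astype(int))
        if key not in ndt:
            continue
        d = p - ndt[key][0]
        score += np.exp(-0.5 * d @ ndt[key][1] @ d)
    return score

def doppler_score(azimuths, dopplers, ego_v, sigma_v=0.2):
    """Likelihood of measured Doppler velocities given an ego velocity
    (vx, vy), assuming all reflectors are stationary: the expected radial
    velocity at azimuth theta is -(vx*cos(theta) + vy*sin(theta))."""
    vr_pred = -(ego_v[0] * np.cos(azimuths) + ego_v[1] * np.sin(azimuths))
    return np.sum(np.exp(-0.5 * ((dopplers - vr_pred) / sigma_v) ** 2))
```

A candidate ego-motion would then be scored by something like `ndt_score(transformed_points, ndt) + w * doppler_score(az, vr, ego_v)` and maximized over the motion parameters; the weight `w` and the optimizer are left open here.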