Insufficient environmental information indoor localization of mecanum mobile platform using wheel-visual-inertial odometry

IF 1.5 · CAS Tier 4 (Engineering & Technology) · JCR Q3, Engineering, Mechanical
Chaehyun Lee, Seongyong Hur, David Kim, Yoseph Yang, Dongil Choi
Journal of Mechanical Science and Technology, vol. 42, no. 1
DOI: 10.1007/s12206-024-0836-z
Published: 2024-09-03 · Citations: 0

Abstract

In autonomous driving of a mobile robot, the robot's current position must be identified before a path to the target location can be planned and followed. Accordingly, research on robot localization using GPS, 3D LiDAR, and vision has been actively conducted. However, these approaches struggle in indoor spaces where signals are blocked by walls or ceilings, and in areas where sufficient environmental information cannot be obtained. This paper introduces a position estimation method that overcomes these environmental problems through sensor fusion, demonstrated on an indoor tennis court. We propose a localization method with low latency and high positional accuracy that uses Kalman filters to fuse data from wheel odometry and visual-inertial odometry. To evaluate its performance, the method was compared against wheel odometry, visual-inertial odometry, and LIO-SAM after the robot completed three rectangular paths. The resulting mean absolute errors in the x and y directions were 1.908 m and 0.707 m for wheel odometry, 1.169 m and 1.430 m for visual-inertial odometry, and 0.400 m and 0.383 m for LIO-SAM, respectively. In contrast, the wheel-visual-inertial odometry introduced in this study achieved errors of 0.209 m and 0.103 m in the x and y directions, respectively, indicating superior accuracy compared to the other algorithms. This underscores the effectiveness of the proposed method in indoor environments where signals can be obstructed by walls or ceilings, and in areas lacking abundant environmental information.
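The abstract states that Kalman filters fuse wheel odometry with visual-inertial odometry (VIO), but does not specify the state vector or noise models. The sketch below is a minimal, illustrative version of that idea only: a linear position-only filter in which wheel-odometry displacements drive the prediction step and VIO position estimates serve as measurements. The class name, noise magnitudes, and measurement model are assumptions, not the authors' implementation.

```python
import numpy as np

class OdometryFusionKF:
    """Minimal linear Kalman filter sketch: wheel odometry predicts,
    visual-inertial odometry (VIO) corrects. State is position [x, y].
    Noise values q and r are illustrative, not from the paper."""

    def __init__(self, q=0.05, r=0.2):
        self.x = np.zeros(2)           # state estimate [x, y] (m)
        self.P = np.eye(2)             # state covariance
        self.Q = (q ** 2) * np.eye(2)  # process noise (wheel-odometry drift)
        self.R = (r ** 2) * np.eye(2)  # measurement noise (VIO position)

    def predict(self, wheel_delta):
        """Propagate with the displacement reported by wheel odometry."""
        self.x = self.x + np.asarray(wheel_delta)
        self.P = self.P + self.Q

    def update(self, vio_position):
        """Correct with an absolute position from VIO (H = identity)."""
        z = np.asarray(vio_position)
        S = self.P + self.R                  # innovation covariance
        K = self.P @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ (z - self.x)   # state update
        self.P = (np.eye(2) - K) @ self.P    # covariance update

kf = OdometryFusionKF()
kf.predict([0.10, 0.02])  # wheel odometry reports a 10 cm move in x
kf.update([0.09, 0.01])   # VIO measures a nearby absolute position
```

The fused estimate lands between the wheel-odometry prediction and the VIO measurement, weighted by their covariances, which is the mechanism behind the accuracy gain the abstract reports over either sensor alone.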

Source journal: Journal of Mechanical Science and Technology (Engineering, Mechanical)
CiteScore: 2.90
Self-citation rate: 6.20%
Articles per year: 517
Review time: 7.7 months
Journal introduction: The aim of the Journal of Mechanical Science and Technology is to provide an international forum for the publication and dissemination of original work that contributes to the understanding of the main and related disciplines of mechanical engineering, either empirical or theoretical. The Journal covers the whole spectrum of mechanical engineering, which includes, but is not limited to, Materials and Design Engineering, Production Engineering and Fusion Technology, Dynamics, Vibration and Control, Thermal Engineering and Fluids Engineering. Manuscripts may fall into several categories including full articles, solicited reviews or commentary, and unsolicited reviews or commentary related to the core of mechanical engineering.