Efficient and adaptive lidar–visual–inertial odometry for agricultural unmanned ground vehicle

IF 2.3 · CAS Tier 4 (Computer Science) · JCR Q2 (Computer Science)
Zixuan Zhao, Yucheng Zhang, Long Long, Zaiwang Lu, Jinglin Shi
{"title":"用于农业无人地面车辆的高效自适应激光雷达-视觉-惯性里程计","authors":"Zixuan Zhao, Yucheng Zhang, Long Long, Zaiwang Lu, Jinglin Shi","doi":"10.1177/17298806221094925","DOIUrl":null,"url":null,"abstract":"The accuracy of agricultural unmanned ground vehicles’ localization directly affects the accuracy of their navigation. However, due to the changeable environment and fewer features in the agricultural scene, it is challenging for these unmanned ground vehicles to localize precisely in global positioning system-denied areas with a single sensor. In this article, we present an efficient and adaptive sensor-fusion odometry framework based on simultaneous localization and mapping to handle the localization problems of agricultural unmanned ground vehicles without the assistance of a global positioning system. The framework leverages three kinds of sub-odometry (lidar odometry, visual odometry and inertial odometry) and automatically combines them depending on the environment to provide accurate pose estimation in real time. The combination of sub-odometry is implemented by trading off the robustness and the accuracy of pose estimation. The efficiency and adaptability are mainly reflected in the novel surfel-based iterative closest point method for lidar odometry we propose, which utilizes the changeable surfel radius range and the adaptive iterative closest point initialization to improve the accuracy of pose estimation in different environments. We test our system in various agricultural unmanned ground vehicles’ working zones and some other open data sets, and the results prove that the proposed method shows better performance mainly in accuracy, efficiency and robustness, compared with the state-of-art methods.","PeriodicalId":50343,"journal":{"name":"International Journal of Advanced Robotic Systems","volume":" ","pages":""},"PeriodicalIF":2.3000,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Efficient and adaptive lidar–visual–inertial odometry for agricultural unmanned ground vehicle\",\"authors\":\"Zixuan Zhao, Yucheng Zhang, Long Long, Zaiwang Lu, Jinglin Shi\",\"doi\":\"10.1177/17298806221094925\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The accuracy of agricultural unmanned ground vehicles’ localization directly affects the accuracy of their navigation. However, due to the changeable environment and fewer features in the agricultural scene, it is challenging for these unmanned ground vehicles to localize precisely in global positioning system-denied areas with a single sensor. In this article, we present an efficient and adaptive sensor-fusion odometry framework based on simultaneous localization and mapping to handle the localization problems of agricultural unmanned ground vehicles without the assistance of a global positioning system. The framework leverages three kinds of sub-odometry (lidar odometry, visual odometry and inertial odometry) and automatically combines them depending on the environment to provide accurate pose estimation in real time. The combination of sub-odometry is implemented by trading off the robustness and the accuracy of pose estimation. 
The efficiency and adaptability are mainly reflected in the novel surfel-based iterative closest point method for lidar odometry we propose, which utilizes the changeable surfel radius range and the adaptive iterative closest point initialization to improve the accuracy of pose estimation in different environments. We test our system in various agricultural unmanned ground vehicles’ working zones and some other open data sets, and the results prove that the proposed method shows better performance mainly in accuracy, efficiency and robustness, compared with the state-of-art methods.\",\"PeriodicalId\":50343,\"journal\":{\"name\":\"International Journal of Advanced Robotic Systems\",\"volume\":\" \",\"pages\":\"\"},\"PeriodicalIF\":2.3000,\"publicationDate\":\"2022-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Advanced Robotic Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1177/17298806221094925\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"Computer Science\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Advanced Robotic Systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1177/17298806221094925","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"Computer Science","Score":null,"Total":0}
Citations: 6

Abstract

The localization accuracy of agricultural unmanned ground vehicles directly affects their navigation accuracy. However, because the environment is changeable and agricultural scenes offer few features, it is challenging for these vehicles to localize precisely with a single sensor in areas where the global positioning system is denied. In this article, we present an efficient and adaptive sensor-fusion odometry framework based on simultaneous localization and mapping that handles the localization of agricultural unmanned ground vehicles without the assistance of a global positioning system. The framework leverages three kinds of sub-odometry (lidar odometry, visual odometry, and inertial odometry) and automatically combines them depending on the environment to provide accurate pose estimation in real time. The sub-odometries are combined by trading off the robustness and accuracy of pose estimation. The efficiency and adaptability lie mainly in the novel surfel-based iterative closest point method we propose for lidar odometry, which uses a changeable surfel radius range and adaptive iterative closest point initialization to improve the accuracy of pose estimation in different environments. We test our system in various agricultural unmanned ground vehicle working zones and on several open datasets, and the results show that the proposed method outperforms state-of-the-art methods mainly in accuracy, efficiency, and robustness.
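The abstract only names the surfel-based iterative closest point (ICP) method; the adaptive surfel radius range and the ICP initialization policy are not described here. The following is a minimal, hypothetical sketch of the underlying building block, one point-to-plane ICP update against surfels (a center plus a unit normal), assuming point-to-surfel association has already been done. Function and variable names are illustrative, not the authors' API.

```python
# A minimal sketch (not the authors' code) of one point-to-plane ICP update
# against a surfel map -- the general building block behind surfel-based
# lidar odometry. The adaptive surfel-radius range and the adaptive ICP
# initialization from the abstract are NOT reproduced here; the association
# of points to surfels is assumed to have been done already.
import numpy as np

def skew(v):
    """3x3 skew-symmetric matrix such that skew(v) @ w == np.cross(v, w)."""
    return np.array([[0.0, -v[2], v[1]],
                     [v[2], 0.0, -v[0]],
                     [-v[1], v[0], 0.0]])

def expm_so3(w):
    """Rodrigues' formula: rotation matrix from an axis-angle vector."""
    theta = np.linalg.norm(w)
    if theta < 1e-12:
        return np.eye(3)
    K = skew(w / theta)
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def point_to_plane_icp_step(points, surfel_centers, surfel_normals, R, t):
    """One Gauss-Newton update of the pose (R, t).

    points          -- (N, 3) lidar points in the sensor frame
    surfel_centers  -- (N, 3) centers of the surfels associated to each point
    surfel_normals  -- (N, 3) unit normals of those surfels
    Reduces the point-to-plane error sum_i (n_i . (R p_i + t - c_i))^2.
    """
    H = np.zeros((6, 6))
    g = np.zeros(6)
    for p, c, n in zip(points, surfel_centers, surfel_normals):
        q = R @ p + t                       # point transformed into the map frame
        r = float(n @ (q - c))              # signed point-to-plane residual
        J = np.hstack((-n @ skew(q), n))    # 1x6 Jacobian w.r.t. [rotation | translation]
        H += np.outer(J, J)
        g -= J * r
    dx = np.linalg.solve(H + 1e-6 * np.eye(6), g)   # damped normal equations
    dR = expm_so3(dx[:3])
    return dR @ R, dR @ t + dx[3:]
```

In a full pipeline this step would be iterated until convergence, with the surfel radii and the initial pose guess chosen adaptively per environment, which is where the paper's stated contribution lies.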
Source journal: International Journal of Advanced Robotic Systems
CiteScore: 6.50
Self-citation rate: 0.00%
Articles published: 65
Review time: 6 months
Journal description: International Journal of Advanced Robotic Systems (IJARS) is a JCR-ranked, peer-reviewed open access journal covering the full spectrum of robotics research. The journal is addressed to both practicing professionals and researchers in the field of robotics and its specialty areas. IJARS features fourteen topic areas, each headed by a Topic Editor-in-Chief, integrating all aspects of research in robotics under the journal's domain.