Harnessing U-disparity in Point Clouds for Obstacle Detection

Yang Wei, Chen Gong, Shuo Chen
{"title":"利用点云中的u -视差进行障碍物检测","authors":"Yang Wei, Chen Gong, Shuo Chen","doi":"10.1109/ACPR.2017.106","DOIUrl":null,"url":null,"abstract":"Lidar is an indispensable equipment in autonomous vehicle for environment perception, in which obstacle detection plays an important role in collision avoiding and route planning. The main challenge of Lidar based obstacle detection is that processing the disordered and sparse point clouds would be difficult and time-consuming. Hence, this paper presents a novel usage of U-disparity to locate obstacles indiscriminately with point clouds, which makes obstacle detection effective and efficient. The proposed method firstly uses cross-calibration to align point cloud with reference image, so that a depth map is formed. Then, the U-disparity map is introduced to process Lidar based depth map. Due to the particularity of Lidar based U-disparity, we select local peaks in column in U-disparity to identify relevant disparities of obstacles. After applying filtering and clustering steps on these salient peak disparities, the corresponding obstacles can be precisely localized. Quantitative and qualitative experimental results on KITTI object detection benchmark and road detection benchmark reveal that the proposed method achieves very encouraging performances in various environments.","PeriodicalId":426561,"journal":{"name":"2017 4th IAPR Asian Conference on Pattern Recognition (ACPR)","volume":"95 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Harnessing U-disparity in Point Clouds for Obstacle Detection\",\"authors\":\"Yang Wei, Chen Gong, Shuo Chen\",\"doi\":\"10.1109/ACPR.2017.106\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Lidar is an indispensable equipment in autonomous vehicle for environment perception, in which obstacle detection plays an important role in collision avoiding and route planning. The main challenge of Lidar based obstacle detection is that processing the disordered and sparse point clouds would be difficult and time-consuming. Hence, this paper presents a novel usage of U-disparity to locate obstacles indiscriminately with point clouds, which makes obstacle detection effective and efficient. The proposed method firstly uses cross-calibration to align point cloud with reference image, so that a depth map is formed. Then, the U-disparity map is introduced to process Lidar based depth map. Due to the particularity of Lidar based U-disparity, we select local peaks in column in U-disparity to identify relevant disparities of obstacles. After applying filtering and clustering steps on these salient peak disparities, the corresponding obstacles can be precisely localized. 
Quantitative and qualitative experimental results on KITTI object detection benchmark and road detection benchmark reveal that the proposed method achieves very encouraging performances in various environments.\",\"PeriodicalId\":426561,\"journal\":{\"name\":\"2017 4th IAPR Asian Conference on Pattern Recognition (ACPR)\",\"volume\":\"95 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 4th IAPR Asian Conference on Pattern Recognition (ACPR)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ACPR.2017.106\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 4th IAPR Asian Conference on Pattern Recognition (ACPR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACPR.2017.106","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Lidar is indispensable equipment for environment perception in autonomous vehicles, where obstacle detection plays an important role in collision avoidance and route planning. The main challenge of Lidar-based obstacle detection is that processing disordered and sparse point clouds is difficult and time-consuming. Hence, this paper presents a novel use of U-disparity to locate obstacles indiscriminately from point clouds, making obstacle detection effective and efficient. The proposed method first uses cross-calibration to align the point cloud with a reference image, forming a depth map. Then, a U-disparity map is introduced to process the Lidar-based depth map. Due to the particularity of Lidar-based U-disparity, we select column-wise local peaks in the U-disparity map to identify the disparities associated with obstacles. After applying filtering and clustering steps to these salient peak disparities, the corresponding obstacles can be precisely localized. Quantitative and qualitative experimental results on the KITTI object detection and road detection benchmarks reveal that the proposed method achieves very encouraging performance in various environments.
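
To make the described pipeline concrete, the sketch below illustrates one plausible way to realize its main steps with NumPy: projecting Lidar points into the camera frame via the cross-calibration result to form a depth map, converting depth to disparity and accumulating a U-disparity map, and selecting column-wise local peaks. This is a minimal sketch under stated assumptions, not the authors' implementation; the calibration inputs (K, T_cam_from_lidar), stereo parameters (focal_length, baseline), and thresholds are illustrative placeholders.

# Illustrative sketch of the Lidar-based U-disparity pipeline (assumed details, not the paper's code).
import numpy as np


def project_to_depth_map(points_xyz, K, T_cam_from_lidar, image_shape):
    """Project Nx3 Lidar points into the image plane, keeping the nearest depth per pixel."""
    h, w = image_shape
    # Homogeneous transform into the camera frame (the cross-calibration result).
    pts_h = np.hstack([points_xyz, np.ones((points_xyz.shape[0], 1))])
    pts_cam = (T_cam_from_lidar @ pts_h.T)[:3]                 # 3 x N
    in_front = pts_cam[2] > 0.1
    pts_cam = pts_cam[:, in_front]
    # Pinhole projection with intrinsics K.
    uvw = K @ pts_cam
    uv = (uvw[:2] / uvw[2]).round().astype(int)
    depth = pts_cam[2]
    valid = (uv[0] >= 0) & (uv[0] < w) & (uv[1] >= 0) & (uv[1] < h)
    depth_map = np.full((h, w), np.inf)
    # Keep the closest point when several points project to the same pixel.
    np.minimum.at(depth_map, (uv[1][valid], uv[0][valid]), depth[valid])
    depth_map[~np.isfinite(depth_map)] = 0.0
    return depth_map


def build_u_disparity(depth_map, focal_length, baseline, max_disparity=128):
    """Convert depth to disparity and histogram each image column over disparity bins."""
    disparity = np.zeros_like(depth_map, dtype=np.int32)
    nonzero = depth_map > 0
    disparity[nonzero] = np.clip(
        focal_length * baseline / depth_map[nonzero], 0, max_disparity - 1
    ).astype(np.int32)
    h, w = disparity.shape
    u_disp = np.zeros((max_disparity, w), dtype=np.int32)
    for u in range(w):
        col = disparity[:, u]
        u_disp[:, u] = np.bincount(col[col > 0], minlength=max_disparity)
    return u_disp


def column_local_peaks(u_disp, min_votes=5):
    """Return (column, disparity, votes) triples that are local maxima within their column."""
    peaks = []
    for u in range(u_disp.shape[1]):
        col = u_disp[:, u]
        for d in range(1, len(col) - 1):
            if col[d] >= min_votes and col[d] > col[d - 1] and col[d] >= col[d + 1]:
                peaks.append((u, d, int(col[d])))
    return peaks

The subsequent filtering and clustering of these peak disparities into obstacle instances (for example, grouping nearby (column, disparity) peaks) would follow as a separate step; the abstract does not specify those choices, so they are left open here.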