Pedestrian Lane Detection in Unstructured Environments for Assistive Navigation

M. Le, S. L. Phung, A. Bouzerdoum
{"title":"Pedestrian Lane Detection in Unstructured Environments for Assistive Navigation","authors":"M. Le, S. L. Phung, A. Bouzerdoum","doi":"10.1109/DICTA.2014.7008122","DOIUrl":null,"url":null,"abstract":"Automatically finding paths is a crucial and challenging task in autonomous navigation systems. The task becomes more difficult in unstructured environments such as indoor or outdoor scenes with unmarked pedestrian lanes under severe illumination conditions, complex lane surface structures, and occlusion. This paper proposes a robust method for pedestrian lane detection in such unstructured environments. The proposed method detects the walking lane in a probabilistic framework integrating both appearance of the lane region and characteristics of the lane borders. The vanishing point is employed to identify the lane borders. We propose an improved vanishing point estimation method based on orientation of color edges, and use pedestrian detection for occlusion handling. The proposed pedestrian lane detection method is evaluated on a new data set of 2000 images collected from various indoor and outdoor scenes with different types of unmarked lanes. Experimental results and comparisons with other existing methods on the new data set have shown the efficiency and robustness of the proposed method.","PeriodicalId":146695,"journal":{"name":"2014 International Conference on Digital Image Computing: Techniques and Applications (DICTA)","volume":"115 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 International Conference on Digital Image Computing: Techniques and Applications (DICTA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/DICTA.2014.7008122","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 9

Abstract

Automatically finding paths is a crucial and challenging task in autonomous navigation systems. The task becomes more difficult in unstructured environments, such as indoor or outdoor scenes with unmarked pedestrian lanes, severe illumination conditions, complex lane surface structures, and occlusion. This paper proposes a robust method for pedestrian lane detection in such unstructured environments. The method detects the walking lane within a probabilistic framework that integrates both the appearance of the lane region and the characteristics of the lane borders. The vanishing point is employed to identify the lane borders. We propose an improved vanishing-point estimation method based on the orientation of color edges, and use pedestrian detection for occlusion handling. The proposed pedestrian lane detection method is evaluated on a new data set of 2000 images collected from various indoor and outdoor scenes with different types of unmarked lanes. Experimental results and comparisons with existing methods on the new data set demonstrate the efficiency and robustness of the proposed method.
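The central geometric step the abstract describes is estimating the vanishing point from edge orientations and then using it to locate the lane borders. Below is a minimal illustrative sketch of vanishing-point estimation by edge-orientation voting. It is a reconstruction under stated assumptions, not the authors' implementation: the paper votes with the orientation of color edges, whereas this sketch uses a single luminance gradient, a coarse voting grid, and arbitrary parameters (`edge_thresh`, `grid_step`) for brevity.

```python
# Minimal sketch: vanishing-point estimation by edge-orientation voting.
# Illustrative only; the gray-level gradient, thresholds, and coarse
# voting grid are assumptions, not the paper's actual pipeline.
import cv2
import numpy as np

def estimate_vanishing_point(image_bgr, edge_thresh=50.0, grid_step=8):
    """Return (x, y) of the strongest vanishing-point candidate."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    # Gradients give the edge orientation at each pixel (the paper uses
    # color edges; a single gray channel keeps this sketch short).
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    strong = np.hypot(gx, gy) > edge_thresh

    h, w = gray.shape
    votes = np.zeros((h // grid_step + 1, w // grid_step + 1), np.float32)
    ys, xs = np.nonzero(strong)
    for x, y, dx, dy in zip(xs, ys, gx[ys, xs], gy[ys, xs]):
        ex, ey = -dy, dx   # edge direction is perpendicular to the gradient
        if ey == 0:
            continue       # horizontal edge line never crosses rows above
        # Extend the edge line upward; each candidate cell it crosses gets
        # one vote. Lane borders converge at the vanishing point, so the
        # cell with the most votes is the estimate. (A real implementation
        # would vectorize this loop.)
        for cy in range(0, y, grid_step):
            cx = x + ex * (cy - y) / ey
            if 0 <= cx < w:
                votes[cy // grid_step, int(cx) // grid_step] += 1
    vy, vx = np.unravel_index(np.argmax(votes), votes.shape)
    return vx * grid_step, vy * grid_step

if __name__ == "__main__":
    img = cv2.imread("lane.jpg")   # any test image with a walkway
    assert img is not None, "test image not found"
    print(estimate_vanishing_point(img))
```

In the full method, the estimated vanishing point constrains the search for the two lane borders, and a pedestrian detector masks occluded regions; both steps are omitted from this sketch.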