Tracking and classification of arbitrary objects with bottom-up/top-down detection

M. Himmelsbach, Hans-Joachim Wünsche
DOI: 10.1109/IVS.2012.6232181
Published in: 2012 IEEE Intelligent Vehicles Symposium, 2012-06-03
Citations: 46

Abstract

Recently, the introduction of dense, long-range 3D sensors has facilitated tracking of arbitrary objects. Especially in the context of autonomous driving, other traffic participants on the road usually remain well-segmented from each other. In contrast, pedestrians and bicyclists do not always stay on the road, and they often get close to static structures in the environment, e.g. traffic lights or signs, bushes, parked cars, etc. These objects are not as easy to segment, often resulting in an under-segmentation of the scene and incorrect tracking results. This paper addresses the problem of tracking moving objects that are hard to segment from their static surroundings by utilizing top-down knowledge about the geometry of existing tracks during segmentation. This includes methods for discerning static from moving objects to reduce the rate of false-positive tracks, as well as a classification of tracks into pedestrian, bicyclist, motorbike, passenger car, van, and truck classes by considering an object's appearance and motion history. The proposed tracking system is experimentally validated in challenging real-world inner-city traffic scenes.
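The core idea of top-down detection, as described in the abstract, is to let the predicted geometry of existing tracks guide point-cloud segmentation, so that points belonging to a tracked object are claimed by its track before a bottom-up clustering can merge them with nearby static structure. The paper does not publish code; the following is a minimal hypothetical sketch of that gating step, assuming ground-plane point coordinates and axis-aligned predicted track boxes (the function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def topdown_assign(points, track_boxes, margin=0.5):
    """Claim points for predicted track boxes before bottom-up clustering.

    points:      (N, 2) array of ground-plane point coordinates.
    track_boxes: list of (cx, cy, half_w, half_l) boxes predicted from
                 existing tracks (hypothetical representation).
    margin:      extra gating slack around each box, in meters.

    Returns an array of track indices; -1 marks points left for
    bottom-up segmentation.
    """
    labels = np.full(len(points), -1, dtype=int)
    for i, (cx, cy, hw, hl) in enumerate(track_boxes):
        # A point is claimed if it falls inside the (margin-inflated) box
        # and has not already been claimed by an earlier track.
        inside = (np.abs(points[:, 0] - cx) <= hw + margin) & \
                 (np.abs(points[:, 1] - cy) <= hl + margin)
        labels[inside & (labels == -1)] = i
    return labels
```

For example, with one predicted box centered at the origin, points near the origin are assigned to that track while a distant point remains unlabeled and is left to the bottom-up segmentation stage. The actual system additionally discriminates static from moving tracks and classifies tracks from appearance and motion history, which this sketch omits.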