Tunable Kernels for Tracking

Vasu Parameswaran, Visvanathan Ramesh, Imad Zoghlami
{"title":"Tunable Kernels for Tracking","authors":"Vasu Parameswaran, Visvanathan Ramesh, Imad Zoghlami","doi":"10.1109/CVPR.2006.317","DOIUrl":null,"url":null,"abstract":"We present a tunable representation for tracking that simultaneously encodes appearance and geometry in a manner that enables the use of mean-shift iterations for tracking. The classic formulation of the tracking problem using mean-shift iterations encodes spatial information very loosely (i.e. using radially symmetric kernels). A problem with such a formulation is that it becomes easy for the tracker to get confused with other objects having the same feature distribution but different spatial configurations of features. Subsequent approaches have addressed this issue but not to the degree of generality required for tracking specific classes of objects and motions (e.g. humans walking). In this paper, we formulate the tracking problem in a manner that encodes the spatial configuration of features along with their density and yet retains robustness to spatial deformations and feature density variations. The encoding of spatial configuration is done using a set of kernels whose parameters can be optimized for a given class of objects and motions, off-line. The formulation enables the use of meanshift iterations and runs in real-time. We demonstrate better tracking results on synthetic and real image sequences as compared to the original mean-shift tracker.","PeriodicalId":421737,"journal":{"name":"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-06-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"49","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR'06)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPR.2006.317","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 49

Abstract

We present a tunable representation for tracking that simultaneously encodes appearance and geometry in a manner that enables the use of mean-shift iterations for tracking. The classic formulation of the tracking problem using mean-shift iterations encodes spatial information only loosely (i.e., using radially symmetric kernels). A problem with such a formulation is that the tracker is easily confused by other objects that have the same feature distribution but a different spatial configuration of features. Subsequent approaches have addressed this issue, but not with the degree of generality required for tracking specific classes of objects and motions (e.g., humans walking). In this paper, we formulate the tracking problem in a manner that encodes the spatial configuration of features along with their density, yet retains robustness to spatial deformations and feature-density variations. The spatial configuration is encoded using a set of kernels whose parameters can be optimized off-line for a given class of objects and motions. The formulation enables the use of mean-shift iterations and runs in real time. We demonstrate better tracking results on synthetic and real image sequences compared with the original mean-shift tracker.
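For context, the sketch below illustrates the baseline the abstract refers to: one iteration of the classic mean-shift tracker with a radially symmetric (Epanechnikov) kernel and a kernel-weighted histogram, in which spatial structure inside the window is discarded. This is not the paper's tunable-kernel method; it is a minimal illustrative implementation, and all function and variable names (`epanechnikov_weights`, `kernel_histogram`, `mean_shift_step`) are hypothetical.

```python
# Illustrative sketch of the classic radially symmetric mean-shift tracker
# (the baseline discussed in the abstract), not the paper's tunable kernels.
import numpy as np

def epanechnikov_weights(h):
    """Radially symmetric Epanechnikov kernel over an h x h window (h odd)."""
    r = (h - 1) // 2
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    d2 = (xs ** 2 + ys ** 2) / (r ** 2 + 1e-12)
    k = np.maximum(1.0 - d2, 0.0)          # zero outside the circular support
    return k / k.sum()

def kernel_histogram(patch, kernel, n_bins=16):
    """Kernel-weighted intensity histogram of a grayscale patch."""
    bins = np.clip((patch.astype(np.float64) / 256.0 * n_bins).astype(int),
                   0, n_bins - 1)
    hist = np.zeros(n_bins)
    np.add.at(hist, bins.ravel(), kernel.ravel())
    return hist / hist.sum()

def mean_shift_step(image, center, target_hist, h=31, n_bins=16):
    """One mean-shift iteration; returns the updated (row, col) center."""
    r = (h - 1) // 2
    cy, cx = int(round(center[0])), int(round(center[1]))
    patch = image[cy - r:cy + r + 1, cx - r:cx + r + 1]
    kernel = epanechnikov_weights(h)
    cand_hist = kernel_histogram(patch, kernel, n_bins)

    # Per-pixel weights from the Bhattacharyya-coefficient linearization:
    # w_i = sqrt(q[b(x_i)] / p[b(x_i)]).
    bins = np.clip((patch.astype(np.float64) / 256.0 * n_bins).astype(int),
                   0, n_bins - 1)
    w = np.sqrt(target_hist[bins] / np.maximum(cand_hist[bins], 1e-12))

    # With the Epanechnikov kernel the shift reduces to a weighted mean of
    # pixel offsets inside the kernel support.
    ys, xs = np.mgrid[-r:r + 1, -r:r + 1]
    support = kernel > 0
    wsum = w[support].sum()
    dy = (w[support] * ys[support]).sum() / wsum
    dx = (w[support] * xs[support]).sum() / wsum
    return center[0] + dy, center[1] + dx
```

In use, `mean_shift_step` would be iterated until the center converges. Because the kernel depends only on radial distance, two windows with the same intensity histogram but different spatial layouts are indistinguishable to this tracker; the paper's contribution is to replace this single radially symmetric kernel with a set of kernels whose parameters are tuned off-line so that spatial configuration is encoded as well.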