A New Method for Object Tracking Based on Regions Instead of Contours

N. Gómez, R. Alquézar, F. Serratosa
Published in: 2007 IEEE Conference on Computer Vision and Pattern Recognition
DOI: 10.1109/CVPR.2007.383454
Publication date: 2007-06-17
Citations: 5

Abstract

This paper presents a new method for object tracking in video sequences that is especially suitable for very noisy environments. In such situations, segmented images from one frame to the next are usually so different that it is very hard, or even impossible, to match the corresponding regions or contours of the two images. With the aim of tracking objects in these situations, our approach has two main characteristics. On the one hand, we assume that tracking approaches based on contours cannot be applied, and therefore our system uses object recognition results computed from regions (specifically, colour spots from segmented images). On the other hand, we discard matching the spots of consecutive segmented images and, consequently, the methods that represent objects by structures such as graphs or skeletons, since the structures obtained may be too different in consecutive frames. Instead, we represent the location of tracked objects through images of probabilities that are updated dynamically using both the recognition and tracking results of previous steps. From these probabilities and a simple prediction of the apparent motion of the object in the image, a binary decision can be made for each pixel and object.
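The update described in the abstract can be illustrated with a minimal sketch. The paper itself does not specify the update rule; the function below, the blending weight `alpha`, the decision `threshold`, and the use of a simple translation for the apparent-motion prediction are all illustrative assumptions, showing only the general shape of the mechanism: shift the previous probability image by the predicted motion, blend it with per-pixel recognition evidence, and threshold to obtain a binary decision per pixel.

```python
import numpy as np

def update_probability_map(prob_map, recognition_map, motion,
                           alpha=0.5, threshold=0.5):
    """Illustrative per-pixel tracking update (not the paper's exact rule).

    prob_map        : HxW array, probability that each pixel belongs to the
                      tracked object (state from the previous frame).
    recognition_map : HxW array, per-pixel recognition evidence computed from
                      colour-spot regions in the current segmented frame.
    motion          : (dy, dx) predicted apparent motion, as a pixel shift.
    alpha, threshold: hypothetical parameters chosen for the sketch.
    """
    # Shift the previous probabilities by the predicted apparent motion.
    predicted = np.roll(prob_map, shift=motion, axis=(0, 1))
    # Blend the motion-compensated prediction with new recognition evidence.
    updated = alpha * predicted + (1.0 - alpha) * recognition_map
    # Binary decision for each pixel: does it belong to the object?
    mask = updated > threshold
    return updated, mask
```

For example, if the object was at pixel (1, 1) with probability 1.0 and the predicted motion is one pixel down and right, the blended probability at (2, 2) becomes `0.5 * 1.0 + 0.5 * recognition`, and thresholding yields the new binary object mask.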