{"title":"用移动相机跟踪多个彩色斑点","authors":"Antonis A. Argyros, Manolis I. A. Lourakis","doi":"10.1109/CVPR.2005.348","DOIUrl":null,"url":null,"abstract":"This paper concerns a method for tracking multiple blobs exhibiting certain color distributions in images acquired by a possibly moving camera. The method encompasses a collection of techniques that enable modeling and detecting the blobs possessing the desired color distribution(s), as well as inferring their temporal association across image sequences. Appropriately colored blobs are detected with a Bayesian classifier, which is bootstrapped with a small set of training data. Then, an online iterative training procedure is employed to refine the classifier using additional training images. Online adaptation of color probabilities is used to enable the classifier to cope with illumination changes. Tracking over time is realized through a novel technique, which can handle multiple colored blobs. Such blobs may move in complex trajectories and occlude each other in the field of view of a possibly moving camera, while their number may vary over time. A prototype implementation of the developed system running on a conventional Pentium IV processor at 2.5 GHz operates on 320/spl times/240 live video in real time (30Hz). It is worth pointing out that currently, the cycle time of the tracker is determined by the maximum acquisition frame rate that is supported by our IEEE 1394 camera, rather than the latency introduced by the computational overhead for tracking blobs.","PeriodicalId":89346,"journal":{"name":"Conference on Computer Vision and Pattern Recognition Workshops. IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Workshops","volume":"40 1","pages":"1178"},"PeriodicalIF":0.0000,"publicationDate":"2005-06-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"15","resultStr":"{\"title\":\"Tracking Multiple Colored Blobs with a Moving Camera\",\"authors\":\"Antonis A. Argyros, Manolis I. A. Lourakis\",\"doi\":\"10.1109/CVPR.2005.348\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper concerns a method for tracking multiple blobs exhibiting certain color distributions in images acquired by a possibly moving camera. The method encompasses a collection of techniques that enable modeling and detecting the blobs possessing the desired color distribution(s), as well as inferring their temporal association across image sequences. Appropriately colored blobs are detected with a Bayesian classifier, which is bootstrapped with a small set of training data. Then, an online iterative training procedure is employed to refine the classifier using additional training images. Online adaptation of color probabilities is used to enable the classifier to cope with illumination changes. Tracking over time is realized through a novel technique, which can handle multiple colored blobs. Such blobs may move in complex trajectories and occlude each other in the field of view of a possibly moving camera, while their number may vary over time. A prototype implementation of the developed system running on a conventional Pentium IV processor at 2.5 GHz operates on 320/spl times/240 live video in real time (30Hz). 
It is worth pointing out that currently, the cycle time of the tracker is determined by the maximum acquisition frame rate that is supported by our IEEE 1394 camera, rather than the latency introduced by the computational overhead for tracking blobs.\",\"PeriodicalId\":89346,\"journal\":{\"name\":\"Conference on Computer Vision and Pattern Recognition Workshops. IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Workshops\",\"volume\":\"40 1\",\"pages\":\"1178\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2005-06-20\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"15\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Conference on Computer Vision and Pattern Recognition Workshops. IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Workshops\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CVPR.2005.348\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Conference on Computer Vision and Pattern Recognition Workshops. IEEE Computer Society Conference on Computer Vision and Pattern Recognition. Workshops","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CVPR.2005.348","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Tracking Multiple Colored Blobs with a Moving Camera
This paper concerns a method for tracking multiple blobs exhibiting certain color distributions in images acquired by a possibly moving camera. The method encompasses a collection of techniques that enable modeling and detecting the blobs possessing the desired color distribution(s), as well as inferring their temporal association across image sequences. Appropriately colored blobs are detected with a Bayesian classifier, which is bootstrapped with a small set of training data. Then, an online iterative training procedure is employed to refine the classifier using additional training images. Online adaptation of color probabilities is used to enable the classifier to cope with illumination changes. Tracking over time is realized through a novel technique, which can handle multiple colored blobs. Such blobs may move in complex trajectories and occlude each other in the field of view of a possibly moving camera, while their number may vary over time. A prototype implementation of the developed system running on a conventional Pentium IV processor at 2.5 GHz operates on 320×240 live video in real time (30 Hz). It is worth pointing out that currently, the cycle time of the tracker is determined by the maximum acquisition frame rate that is supported by our IEEE 1394 camera, rather than the latency introduced by the computational overhead for tracking blobs.
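To make the color-detection step concrete, below is a minimal sketch of a histogram-based Bayesian color classifier with online adaptation, in the spirit of the approach the abstract describes. The specific choices here (classifying on the UV chrominance plane, 32×32 histogram bins, the `adapt` blending rate) are illustrative assumptions and not details taken from the paper; the tracker's temporal-association technique is not reproduced here.

```python
import numpy as np

N_BINS = 32  # assumed quantization of each chrominance channel


def quantize_uv(uv_pixels):
    """Map (U, V) chrominance values in [0, 255] to histogram bin indices."""
    bins = (uv_pixels.astype(np.int32) * N_BINS) // 256
    return bins[:, 0], bins[:, 1]


class BayesianColorModel:
    """P(blob | color) via Bayes' rule over quantized chrominance histograms."""

    def __init__(self):
        self.hist_color = np.full((N_BINS, N_BINS), 1e-6)             # P(c)
        self.hist_color_given_blob = np.full((N_BINS, N_BINS), 1e-6)  # P(c|s)
        self.prior_blob = 0.0                                         # P(s)

    def train(self, uv_pixels, blob_mask):
        """Bootstrap from a small labelled set: all pixels of a training image
        plus a boolean mask marking pixels of the desired blob color."""
        u, v = quantize_uv(uv_pixels)
        all_hist = np.zeros((N_BINS, N_BINS))
        np.add.at(all_hist, (u, v), 1)
        blob_hist = np.zeros((N_BINS, N_BINS))
        np.add.at(blob_hist, (u[blob_mask], v[blob_mask]), 1)
        self.hist_color = all_hist / all_hist.sum()
        self.hist_color_given_blob = blob_hist / max(blob_mask.sum(), 1)
        self.prior_blob = blob_mask.mean()

    def posterior(self, uv_pixels):
        """Per-pixel P(s|c) = P(s) * P(c|s) / P(c); threshold to get blob pixels."""
        u, v = quantize_uv(uv_pixels)
        p = (self.prior_blob * self.hist_color_given_blob[u, v]
             / self.hist_color[u, v])
        return np.clip(p, 0.0, 1.0)

    def adapt(self, uv_pixels, blob_mask, rate=0.1):
        """Online adaptation: blend freshly estimated histograms with the old
        ones so the classifier can follow gradual illumination changes.
        The blending rate of 0.1 is an arbitrary illustrative value."""
        old_c = self.hist_color.copy()
        old_cs = self.hist_color_given_blob.copy()
        self.train(uv_pixels, blob_mask)
        self.hist_color = rate * self.hist_color + (1 - rate) * old_c
        self.hist_color_given_blob = (rate * self.hist_color_given_blob
                                      + (1 - rate) * old_cs)
```

In use, `train` would be called once on the small bootstrap set, `posterior` would be thresholded on every incoming frame to extract candidate blob pixels, and `adapt` would be invoked periodically with confidently classified pixels to track illumination changes.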