People tracking across two distant self-calibrated cameras

R. Pflugfelder, H. Bischof
{"title":"People tracking across two distant self-calibrated cameras","authors":"R. Pflugfelder, H. Bischof","doi":"10.1109/AVSS.2007.4425343","DOIUrl":null,"url":null,"abstract":"People tracking is of fundamental importance in multi-camera surveillance systems. In recent years, many approaches for multi-camera tracking have been discussed. Most methods use either various image features or the geometric relation between the cameras or both as a cue. It is a desire to know the geometry for distant cameras, because geometry is not influenced by, for example, drastic changes in object appearance or in scene illumination. However, the determination of the camera geometry is cumbersome. The paper tries to solve this problem and contributes in two different ways. On the one hand, an approach is presented that calibrates two distant cameras automatically. We continue previous work and focus especially on the calibration of the extrinsic parameters. Point correspondences are used for this task which are acquired by detecting points on top of people's heads. On the other hand, qualitative experimental results with the PETS 2006 benchmark data show that the self-calibration is accurate enough for a solely geometric tracking of people across distant cameras. Reliable features for a matching are hardly available in such cases.","PeriodicalId":371050,"journal":{"name":"2007 IEEE Conference on Advanced Video and Signal Based Surveillance","volume":"107 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-09-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"34","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 IEEE Conference on Advanced Video and Signal Based Surveillance","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AVSS.2007.4425343","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 34

Abstract

People tracking is of fundamental importance in multi-camera surveillance systems. In recent years, many approaches to multi-camera tracking have been discussed. Most methods use image features, the geometric relation between the cameras, or both as cues. Knowing the geometry is particularly desirable for distant cameras, because geometry is not affected by, for example, drastic changes in object appearance or scene illumination. However, determining the camera geometry is cumbersome. This paper addresses the problem and contributes in two ways. First, an approach is presented that calibrates two distant cameras automatically; it continues previous work and focuses especially on the calibration of the extrinsic parameters, using point correspondences acquired by detecting points on top of people's heads. Second, qualitative experimental results on the PETS 2006 benchmark data show that the self-calibration is accurate enough for a purely geometric tracking of people across distant cameras, a setting in which reliable features for appearance matching are hardly available.
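
The following Python sketch (using OpenCV) illustrates the general idea of recovering the extrinsic relation between two cameras from such point correspondences: an essential matrix is fitted robustly to the head-top correspondences, decomposed into a relative rotation and translation, and the result can then be used to triangulate head positions for a purely geometric hand-over between views. This is not the authors' implementation; the function names, the shared intrinsic matrix K, and the use of OpenCV's essential-matrix routines are assumptions made for illustration only.

    # Minimal sketch (not the paper's implementation) of extrinsic
    # self-calibration from head-top point correspondences. Assumes the
    # intrinsics are known and, for simplicity, identical (K) in both
    # cameras, and that pts1/pts2 are synchronized detections of the
    # same head tops in the two views.
    import numpy as np
    import cv2

    def calibrate_extrinsics(pts1, pts2, K):
        """Estimate relative rotation R and translation t (up to scale)
        between two cameras from corresponding image points (Nx2 arrays)."""
        pts1 = np.asarray(pts1, dtype=np.float64)
        pts2 = np.asarray(pts2, dtype=np.float64)
        # Robust RANSAC fit of the essential matrix rejects bad
        # correspondences (e.g. mismatched head detections).
        E, inliers = cv2.findEssentialMat(pts1, pts2, K,
                                          method=cv2.RANSAC,
                                          prob=0.999, threshold=1.0)
        # Decompose E and keep the (R, t) that places the points in
        # front of both cameras (cheirality check).
        _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=inliers)
        return R, t, inliers

    def triangulate(K, R, t, pts1, pts2):
        """Triangulate corresponding points into 3-D (up to the unknown
        scale of t), enough for a purely geometric track hand-over."""
        P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
        P2 = K @ np.hstack([R, t.reshape(3, 1)])
        X_h = cv2.triangulatePoints(P1, P2,
                                    np.asarray(pts1, float).T,
                                    np.asarray(pts2, float).T)
        return (X_h[:3] / X_h[3]).T

Note that the translation recovered this way is defined only up to scale; fixing a metric scale would require additional scene knowledge, which is beyond this sketch.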