Real-time multi-people tracking by greedy likelihood maximization

Nyan Bo Bo, Francis Deboeverie, P. Veelaert, W. Philips
{"title":"基于贪婪似然最大化的实时多人跟踪","authors":"Nyan Bo Bo, Francis Deboeverie, P. Veelaert, W. Philips","doi":"10.1145/2789116.2789125","DOIUrl":null,"url":null,"abstract":"Unlike tracking rigid targets, the task of tracking multiple people is very challenging because the appearance and the shape of a person varies depending on the target's location and orientation. This paper presents a new approach to track multiple people with high accuracy using a calibrated monocular camera. Our approach recursively updates the positions of all persons based on the observed foreground image and previously known location of each person. This is done by maximizing the likelihood of observing the foreground image given the positions of all persons. Since the computational complexity of our approach is low, it is possible to run in real time on smart cameras. When a network of multiple smart cameras overseeing the scene is available, local position estimates from smart cameras can be fused to produced more accurate joint position estimates. The performance evaluation of our approach on very challenging video sequences from public datasets shows that our tracker achieves high accuracy. When comparing to other state-of-the-art tracking systems, our method outperforms in terms of Multiple Object Tracking Accuracy (MOTA).","PeriodicalId":113163,"journal":{"name":"Proceedings of the 9th International Conference on Distributed Smart Cameras","volume":"42 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"Real-time multi-people tracking by greedy likelihood maximization\",\"authors\":\"Nyan Bo Bo, Francis Deboeverie, P. Veelaert, W. Philips\",\"doi\":\"10.1145/2789116.2789125\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Unlike tracking rigid targets, the task of tracking multiple people is very challenging because the appearance and the shape of a person varies depending on the target's location and orientation. This paper presents a new approach to track multiple people with high accuracy using a calibrated monocular camera. Our approach recursively updates the positions of all persons based on the observed foreground image and previously known location of each person. This is done by maximizing the likelihood of observing the foreground image given the positions of all persons. Since the computational complexity of our approach is low, it is possible to run in real time on smart cameras. When a network of multiple smart cameras overseeing the scene is available, local position estimates from smart cameras can be fused to produced more accurate joint position estimates. The performance evaluation of our approach on very challenging video sequences from public datasets shows that our tracker achieves high accuracy. 
When comparing to other state-of-the-art tracking systems, our method outperforms in terms of Multiple Object Tracking Accuracy (MOTA).\",\"PeriodicalId\":113163,\"journal\":{\"name\":\"Proceedings of the 9th International Conference on Distributed Smart Cameras\",\"volume\":\"42 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2015-09-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 9th International Conference on Distributed Smart Cameras\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/2789116.2789125\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 9th International Conference on Distributed Smart Cameras","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/2789116.2789125","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6

Abstract

Unlike tracking rigid targets, tracking multiple people is very challenging because a person's appearance and shape vary with the target's location and orientation. This paper presents a new approach to tracking multiple people with high accuracy using a calibrated monocular camera. Our approach recursively updates the positions of all persons based on the observed foreground image and the previously known location of each person. This is done by maximizing the likelihood of observing the foreground image given the positions of all persons. Since the computational complexity of our approach is low, it can run in real time on smart cameras. When a network of multiple smart cameras overseeing the scene is available, local position estimates from the smart cameras can be fused to produce more accurate joint position estimates. A performance evaluation of our approach on very challenging video sequences from public datasets shows that our tracker achieves high accuracy. Compared with other state-of-the-art tracking systems, our method performs better in terms of Multiple Object Tracking Accuracy (MOTA).
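The abstract describes the method only at a high level. The sketch below is one plausible, minimal reading of the greedy likelihood maximization step: each person's ground-plane position is updated in turn, keeping the others fixed, by choosing the candidate position that best explains the observed foreground mask. All names and modelling choices here (render_silhouette, camera_project, the rectangular silhouette approximation, the Bernoulli foreground noise model, the step sizes) are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def render_silhouette(shape, positions, camera_project):
    """Predict a binary foreground image from ground-plane positions.
    camera_project maps a ground-plane (x, y) to an image location and
    apparent size (u, v, s); each person is approximated by a filled
    rectangle as a stand-in for a real silhouette model."""
    pred = np.zeros(shape, dtype=bool)
    h, w = shape
    for p in positions:
        u, v, s = camera_project(p)
        u0, u1 = max(int(u - s / 2), 0), min(int(u + s / 2), w)
        v0, v1 = max(int(v - 2 * s), 0), min(int(v), h)
        pred[v0:v1, u0:u1] = True
    return pred

def log_likelihood(fg, pred, p_hit=0.9, p_miss=0.1):
    """Bernoulli log-likelihood of the observed foreground mask fg given
    the predicted silhouette image (hypothetical noise model)."""
    p_fg = np.where(pred, p_hit, p_miss)
    return np.sum(np.where(fg, np.log(p_fg), np.log(1.0 - p_fg)))

def greedy_update(fg, prev_positions, camera_project, step=0.2, n_iters=3):
    """Greedily maximize the foreground likelihood one person at a time,
    starting from the previous positions and keeping all others fixed."""
    positions = [np.array(p, dtype=float) for p in prev_positions]
    offsets = [np.array((dx, dy)) for dx in (-step, 0, step)
                                  for dy in (-step, 0, step)]
    for _ in range(n_iters):
        for i in range(len(positions)):
            best_pos, best_ll = positions[i], -np.inf
            for off in offsets:
                cand = positions[i] + off
                trial = positions[:i] + [cand] + positions[i + 1:]
                ll = log_likelihood(
                    fg, render_silhouette(fg.shape, trial, camera_project))
                if ll > best_ll:
                    best_pos, best_ll = cand, ll
            positions[i] = best_pos
    return positions
```

Because each person is updated locally around the previous estimate and the likelihood is evaluated on a single foreground mask, the per-frame cost stays low, which is consistent with the abstract's claim of real-time operation on smart cameras.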