Counting pedestrians in crowded scenes with efficient sparse learning

M. Shimosaka, S. Masuda, R. Fukui, Taketoshi Mori, Tomomasa Sato
{"title":"Counting pedestrians in crowded scenes with efficient sparse learning","authors":"M. Shimosaka, S. Masuda, R. Fukui, Taketoshi Mori, Tomomasa Sato","doi":"10.1109/ACPR.2011.6166650","DOIUrl":null,"url":null,"abstract":"Counting pedestrians in crowded scenes provides powerful cues for several applications such as traffic, safety, and advertising analysis in urban areas. Recent research progress has shown that direct mapping from image statistics (e.g. area or texture histograms of people regions) to the number of pedestrians, also known as counting by regression, is a promise way of robust pedestrian counting. While leveraging arbitrary image features is encouraged in the counting by regression to improve the accuracy, this leads to risk of over-fitting issue. Furthermore, the most image statistics are sensitive to the way of foreground region segmentation. Hence, careful selection process on both segmentation and feature levels is needed. This paper presents an efficient sparse training method via LARS (Least Angle Regression) to achieve the selection process on both levels, which provides the both sparsity of Lasso and Group Lasso. The experimental results using synthetic and pedestrian counting dataset show that our method provides robust performance with reasonable training cost among the state of the art pedestrian counting methods.","PeriodicalId":287232,"journal":{"name":"The First Asian Conference on Pattern Recognition","volume":"19 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"The First Asian Conference on Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACPR.2011.6166650","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 8

Abstract

Counting pedestrians in crowded scenes provides powerful cues for several applications such as traffic, safety, and advertising analysis in urban areas. Recent research progress has shown that directly mapping image statistics (e.g. area or texture histograms of people regions) to the number of pedestrians, also known as counting by regression, is a promising approach to robust pedestrian counting. While leveraging arbitrary image features is encouraged in counting by regression to improve accuracy, doing so carries a risk of over-fitting. Furthermore, most image statistics are sensitive to how the foreground regions are segmented. Hence, a careful selection process at both the segmentation and feature levels is needed. This paper presents an efficient sparse training method based on LARS (Least Angle Regression) that performs the selection process at both levels, providing both the sparsity of the Lasso and that of the Group Lasso. Experimental results on synthetic data and a pedestrian counting dataset show that our method delivers robust performance at reasonable training cost compared with state-of-the-art pedestrian counting methods.
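The abstract only sketches the model, but the counting-by-regression idea is straightforward: learn a linear map from per-frame image statistics to the pedestrian count, with a sparsity-inducing penalty so that uninformative features (and, in the paper, whole groups of features tied to a particular segmentation) are zeroed out. The combination of Lasso and Group Lasso sparsity the abstract describes matches the standard sparse-group-lasso penalty, roughly λ1·Σ_j |w_j| + λ2·Σ_g ‖w_g‖2. The sketch below is a minimal illustration, not the authors' implementation: it uses scikit-learn's LassoLars on synthetic stand-in data and reproduces only the Lasso (feature-level) half of the selection via the LARS path, omitting the group-level (segmentation-level) sparsity.

```python
# Illustrative sketch only: counting-by-regression with Lasso-style feature
# selection via LARS. All data and names here are synthetic/hypothetical;
# this omits the paper's group-level (segmentation-level) sparsity.
import numpy as np
from sklearn.linear_model import LassoLars

rng = np.random.default_rng(0)

# Stand-in for per-frame image statistics: each row is one frame's feature
# vector (e.g. foreground area, perimeter, texture-histogram bins); the
# target would be the ground-truth pedestrian count for that frame.
n_frames, n_features = 200, 50
X = rng.normal(size=(n_frames, n_features))
true_w = np.zeros(n_features)
true_w[:5] = rng.uniform(0.5, 2.0, size=5)   # only a few features matter
y = X @ true_w + rng.normal(scale=0.1, size=n_frames)

# LARS with the Lasso modification traces the regularization path cheaply;
# the L1 penalty drives coefficients of uninformative features to exactly
# zero, giving the feature-level selection described above.
model = LassoLars(alpha=0.05)
model.fit(X, y)

selected = np.flatnonzero(model.coef_)
print("selected feature indices:", selected)
print("predicted count for one frame:", model.predict(X[:1])[0])
```

Adding the group term on top of this would require a sparse-group-lasso solver rather than plain LassoLars; the paper's contribution is an efficient LARS-style training procedure that yields both kinds of sparsity at once.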