Stride and cadence as a biometric in automatic person identification and verification

Chiraz BenAbdelkader, L. Davis, Ross Cutler
{"title":"Stride and cadence as a biometric in automatic person identification and verification","authors":"Chiraz BenAbdelkader, L. Davis, Ross Cutler","doi":"10.1109/AFGR.2002.1004182","DOIUrl":null,"url":null,"abstract":"Presents a correspondence-free method to automatically estimate the spatio-temporal parameters of gait (stride length and cadence) of a walking person from video. Stride and cadence are functions of body height, weight and gender, and we use these biometrics for identification and verification of people. The cadence is estimated using the periodicity of a walking person. Using a calibrated camera system, the stride length is estimated by first tracking the person and estimating their distance travelled over a period of time. By counting the number of steps (again using periodicity) and assuming constant-velocity walking, we are able to estimate the stride to within 1 cm for a typical outdoor surveillance configuration (under certain assumptions). With a database of 17 people and eight samples of each, we show that a person is verified with an equal error rate (EER) of 11%, and correctly identified with a probability of 40%. This method works with low-resolution images of people and is robust to changes in lighting, clothing and tracking errors. It is view-invariant, though performance is optimal in a near-fronto-parallel configuration.","PeriodicalId":364299,"journal":{"name":"Proceedings of Fifth IEEE International Conference on Automatic Face Gesture Recognition","volume":"73 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2002-05-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"275","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of Fifth IEEE International Conference on Automatic Face Gesture Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AFGR.2002.1004182","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 275

Abstract

Presents a correspondence-free method to automatically estimate the spatio-temporal parameters of gait (stride length and cadence) of a walking person from video. Stride and cadence are functions of body height, weight and gender, and we use these biometrics for identification and verification of people. The cadence is estimated using the periodicity of a walking person. Using a calibrated camera system, the stride length is estimated by first tracking the person and estimating their distance travelled over a period of time. By counting the number of steps (again using periodicity) and assuming constant-velocity walking, we are able to estimate the stride to within 1 cm for a typical outdoor surveillance configuration (under certain assumptions). With a database of 17 people and eight samples of each, we show that a person is verified with an equal error rate (EER) of 11%, and correctly identified with a probability of 40%. This method works with low-resolution images of people and is robust to changes in lighting, clothing and tracking errors. It is view-invariant, though performance is optimal in a near-fronto-parallel configuration.
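As a rough illustration (not taken from the paper itself), the two spatio-temporal parameters reduce to simple ratios once tracking and periodicity analysis are done: cadence is the step count divided by the elapsed time, and stride length is the ground-plane distance traveled divided by the step count, under the constant-velocity assumption. The following is a minimal Python sketch of those ratios; the function name, inputs, and the assumption that a tracker already supplies calibrated ground-plane positions and a periodicity-based step count are hypothetical.

```python
import numpy as np

def stride_and_cadence(positions, timestamps, n_steps):
    """Estimate stride length and cadence from a ground-plane track.

    Hypothetical illustration of the distance/step-count ratios described
    in the abstract. `positions` are (x, y) ground-plane coordinates in
    metres from a calibrated camera, `timestamps` are in seconds, and
    `n_steps` is the step count obtained from the gait periodicity.
    Assumes approximately constant-velocity walking.
    """
    positions = np.asarray(positions, dtype=float)
    timestamps = np.asarray(timestamps, dtype=float)

    # Total path length traveled on the ground plane.
    distance = np.sum(np.linalg.norm(np.diff(positions, axis=0), axis=1))
    elapsed = timestamps[-1] - timestamps[0]

    cadence = n_steps / elapsed           # steps per second
    stride_length = distance / n_steps    # metres per step
    return stride_length, cadence


# Example usage with synthetic data: 10 m walked in 8 s with 14 steps.
if __name__ == "__main__":
    t = np.linspace(0.0, 8.0, 200)
    xy = np.stack([1.25 * t, np.zeros_like(t)], axis=1)
    stride, cad = stride_and_cadence(xy, t, n_steps=14)
    print(f"stride = {stride:.2f} m, cadence = {cad:.2f} steps/s")
```

In this simplified view, the sub-centimetre stride accuracy reported in the abstract comes from the fact that tracking error in the total distance is divided by the number of steps, so small per-frame localization noise largely averages out over a long constant-velocity track.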