ReSGait: The Real-Scene Gait Dataset

Zihao Mu, F. M. Castro, M. Marín-Jiménez, Nicolás Guil Mata, Yan-Ran Li, Shiqi Yu
{"title":"res步态:真实场景步态数据集","authors":"Zihao Mu, F. M. Castro, M. Marín-Jiménez, Nicolás Guil Mata, Yan-Ran Li, Shiqi Yu","doi":"10.1109/IJCB52358.2021.9484347","DOIUrl":null,"url":null,"abstract":"Many studies have shown that gait recognition can be used to identify humans at a long distance, with promising results on current datasets. However, those datasets are collected under controlled situations and predefined conditions, which limits the extrapolation of the results to unconstrained situations in which the subjects walk freely in scenes. To cover this gap, we release a novel real-scene gait dataset (ReSGait), which is the first dataset collected in unconstrained scenarios with freely moving subjects and not controlled environmental parameters. Overall, our dataset is composed of 172 subjects and 870 video sequences, recorded over 15 months. Video sequences are labeled with gender, clothing, carrying conditions, taken walking route, and whether mobile phones were used or not. Therefore, the main characteristics of our dataset that differentiate it from other datasets are as follows: (i) uncontrolled real-life scenes and (ii) long recording time. Finally, we empirically assess the difficulty of the proposed dataset by evaluating state-of-the-art gait approaches for silhouette and pose modalities. The results reveal an accuracy of less than 35%, showing the inherent level of difficulty of our dataset compared to other current datasets, in which accuracies are higher than 90%. Thus, our proposed dataset establishes a new level of difficulty in the gait recognition problem, much closer to real life.","PeriodicalId":175984,"journal":{"name":"2021 IEEE International Joint Conference on Biometrics (IJCB)","volume":"138 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-08-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"ReSGait: The Real-Scene Gait Dataset\",\"authors\":\"Zihao Mu, F. M. Castro, M. Marín-Jiménez, Nicolás Guil Mata, Yan-Ran Li, Shiqi Yu\",\"doi\":\"10.1109/IJCB52358.2021.9484347\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Many studies have shown that gait recognition can be used to identify humans at a long distance, with promising results on current datasets. However, those datasets are collected under controlled situations and predefined conditions, which limits the extrapolation of the results to unconstrained situations in which the subjects walk freely in scenes. To cover this gap, we release a novel real-scene gait dataset (ReSGait), which is the first dataset collected in unconstrained scenarios with freely moving subjects and not controlled environmental parameters. Overall, our dataset is composed of 172 subjects and 870 video sequences, recorded over 15 months. Video sequences are labeled with gender, clothing, carrying conditions, taken walking route, and whether mobile phones were used or not. Therefore, the main characteristics of our dataset that differentiate it from other datasets are as follows: (i) uncontrolled real-life scenes and (ii) long recording time. Finally, we empirically assess the difficulty of the proposed dataset by evaluating state-of-the-art gait approaches for silhouette and pose modalities. The results reveal an accuracy of less than 35%, showing the inherent level of difficulty of our dataset compared to other current datasets, in which accuracies are higher than 90%. 
Thus, our proposed dataset establishes a new level of difficulty in the gait recognition problem, much closer to real life.\",\"PeriodicalId\":175984,\"journal\":{\"name\":\"2021 IEEE International Joint Conference on Biometrics (IJCB)\",\"volume\":\"138 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-08-04\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE International Joint Conference on Biometrics (IJCB)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCB52358.2021.9484347\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE International Joint Conference on Biometrics (IJCB)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCB52358.2021.9484347","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 10

Abstract

Many studies have shown that gait recognition can be used to identify humans at a long distance, with promising results on current datasets. However, those datasets are collected under controlled situations and predefined conditions, which limits the extrapolation of the results to unconstrained situations in which subjects walk freely in real scenes. To cover this gap, we release a novel real-scene gait dataset (ReSGait), the first dataset collected in unconstrained scenarios with freely moving subjects and uncontrolled environmental parameters. Overall, our dataset is composed of 172 subjects and 870 video sequences recorded over 15 months. Video sequences are labeled with gender, clothing, carrying conditions, the walking route taken, and whether a mobile phone was used. Therefore, the main characteristics that differentiate our dataset from others are: (i) uncontrolled real-life scenes and (ii) a long recording time. Finally, we empirically assess the difficulty of the proposed dataset by evaluating state-of-the-art gait approaches for the silhouette and pose modalities. The results reveal accuracies of less than 35%, showing the inherent difficulty of our dataset compared to other current datasets, on which accuracies are higher than 90%. Thus, our proposed dataset establishes a new level of difficulty in the gait recognition problem, much closer to real life.
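
Since only the abstract is available on this page, the sketch below merely illustrates how the per-sequence annotations it lists (gender, clothing, carrying condition, walking route, mobile-phone use) and a standard rank-1 gallery/probe evaluation could be expressed in code. The class fields, file layout, and evaluation protocol shown here are assumptions made for illustration, not the authors' released data format or evaluation tooling.

from dataclasses import dataclass
import numpy as np

@dataclass
class ReSGaitSequence:
    # One video sequence with the labels described in the abstract.
    # Field names and the path layout are hypothetical; the released
    # dataset may organize its annotations differently.
    subject_id: int        # one of the 172 subjects
    video_path: str        # path to the recorded sequence (assumed layout)
    gender: str            # e.g. "female" / "male"
    clothing: str          # clothing label
    carrying: str          # carrying condition, e.g. "bag" / "none"
    route: str             # walking route taken through the scene
    phone_in_use: bool     # whether a mobile phone was used while walking

def rank1_accuracy(gallery_feats, gallery_ids, probe_feats, probe_ids):
    # Generic rank-1 identification accuracy by nearest neighbour in an
    # embedding space; a protocol sketch, not the paper's evaluation code.
    # gallery_feats: (G, D), probe_feats: (P, D), ids: 1-D integer arrays.
    dists = np.linalg.norm(
        probe_feats[:, None, :] - gallery_feats[None, :, :], axis=-1)  # (P, G)
    predicted = gallery_ids[np.argmin(dists, axis=1)]  # closest gallery subject
    return float(np.mean(predicted == probe_ids))

With silhouette- or pose-based embeddings plugged into rank1_accuracy, the abstract's headline comparison (below 35% on ReSGait versus above 90% on controlled datasets) corresponds to this kind of identification score.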