Clothing-invariant gait recognition using convolutional neural network

Tze-Wei Yeoh, H. Aguirre, Kiyoshi Tanaka
{"title":"Clothing-invariant gait recognition using convolutional neural network","authors":"Tze-Wei Yeoh, H. Aguirre, Kiyoshi Tanaka","doi":"10.1109/ISPACS.2016.7824728","DOIUrl":null,"url":null,"abstract":"Gait recognition is recognizing human through the style in which they walk. However, the recognition task can become complicated due to the existence of covariate factors (e.g. clothing, camera viewpoint, carrying condition, elapsed time, walking surface, etc). Amongst all the covariate factors, clothing is the most challenging one. This is because it may obscure a significant amount of discriminative human gait features and makes it much more challenging for human recognition task. In recent, there has been significant research on this problem. However, conventional state-of-the-art methods have mostly use hand-crafted features for representing the human gait. In this work, we explore and study the use of convolutional neural networks (CNN) to automatically learn gait features or representations directly from low-level input raw data (i.e. Gait Energy Image (GEI)). Evaluations on the challenging clothing-invariant gait recognition of OU-ISIR Treadmill dataset B, the experiment results shows that our method can achieve far better performance as compared to hand-crafted feature in conventional state-of-the-art methods with minimal preprocessing knowledge of the problem are required.","PeriodicalId":131543,"journal":{"name":"2016 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"35","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 International Symposium on Intelligent Signal Processing and Communication Systems (ISPACS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISPACS.2016.7824728","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 35

Abstract

Gait recognition identifies people by the way they walk. The task is complicated by covariate factors such as clothing, camera viewpoint, carrying condition, elapsed time, and walking surface. Among these, clothing is the most challenging, because it can obscure a large amount of discriminative gait information and thereby makes recognition considerably harder. This problem has attracted significant research in recent years, but conventional state-of-the-art methods have mostly relied on hand-crafted features to represent the human gait. In this work, we explore the use of convolutional neural networks (CNNs) to learn gait features or representations automatically from low-level raw input, namely the Gait Energy Image (GEI). Evaluated on the challenging clothing-invariant gait recognition task of the OU-ISIR Treadmill dataset B, the experimental results show that our method achieves far better performance than the hand-crafted features of conventional state-of-the-art methods, while requiring only minimal preprocessing and prior knowledge of the problem.
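
To make the pipeline concrete, the following Python sketch (an illustrative assumption, not the authors' implementation) shows the two ingredients the abstract refers to: forming a Gait Energy Image by averaging aligned binary silhouettes over one gait cycle, and passing the single-channel GEI to a small CNN that learns discriminative features end to end. The GaitCNN layer sizes, the 128x88 GEI resolution, and the gallery size used below are hypothetical choices, not details taken from the paper.

```python
# Minimal sketch (assumed, not the paper's architecture): GEI construction + a small CNN.
# Silhouettes are assumed to be already segmented, size-normalized, and centred.
import numpy as np
import torch
import torch.nn as nn


def gait_energy_image(silhouettes: np.ndarray) -> np.ndarray:
    """Average a stack of aligned binary silhouettes (T, H, W) over one gait cycle."""
    return silhouettes.astype(np.float32).mean(axis=0)


class GaitCNN(nn.Module):
    """Illustrative CNN mapping a single-channel GEI to per-subject logits."""

    def __init__(self, num_subjects: int):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=5, padding=2), nn.ReLU(), nn.MaxPool2d(2),
        )
        # Flattened size assumes a 128x88 GEI: 32 channels x 32 x 22 after two poolings.
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(32 * 32 * 22, 128), nn.ReLU(),
            nn.Linear(128, num_subjects),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))


if __name__ == "__main__":
    # Toy example: 30 random "silhouette" frames at an assumed 128x88 resolution.
    frames = np.random.rand(30, 128, 88) > 0.5
    gei = gait_energy_image(frames)                # (128, 88), values in [0, 1]
    x = torch.from_numpy(gei)[None, None]          # (batch=1, channel=1, H, W)
    model = GaitCNN(num_subjects=20)               # illustrative gallery size
    logits = model(x)
    print(logits.shape)                            # torch.Size([1, 20])
```

In practice such a network would typically be trained with a cross-entropy loss on GEIs of the gallery subjects, and clothing robustness would be assessed by probing with GEIs recorded under different clothing combinations.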