Gait Recognition Based on 3D Skeleton Data and Graph Convolutional Network

Mengge Mao, Yonghong Song
DOI: 10.1109/IJCB48548.2020.9304916
Published in: 2020 IEEE International Joint Conference on Biometrics (IJCB), 2020-09-28
Citations: 12

Abstract

Gait recognition is an active topic in biometrics because of its unique advantages, such as contactless acquisition and recognition at a distance. Appearance-based gait recognition methods usually extract features from silhouettes of the human body, which are easily affected by factors such as clothing and carried objects. Model-based methods can effectively reduce the influence of such appearance factors, but they have high computational complexity. Therefore, this paper proposes a gait recognition method based on 3D skeleton data and a graph convolutional network. 3D skeleton data are robust to changes of viewpoint. We extract a 3D joint feature and a 3D bone feature from the skeleton data, design a dual graph convolutional network to extract the corresponding gait features, and fuse them at the feature level. We also adopt a multi-loss strategy that combines center loss and softmax loss to optimize the network. Our method is evaluated on the CASIA-B dataset. The experimental results show that the proposed method achieves state-of-the-art performance and effectively reduces the influence of viewpoint, clothing, and other factors.
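The pipeline the abstract describes (bone features derived from 3D joint positions, spatial graph convolution over the skeleton graph, feature-level fusion of the two streams, and a softmax-plus-center multi-loss) can be sketched roughly as below. This is a minimal NumPy illustration under stated assumptions, not the authors' implementation: the 5-joint chain skeleton, the layer sizes, the pooling/concatenation fusion, and the center-loss weight `lam` are all hypothetical choices for demonstration.

```python
import numpy as np

# Hypothetical 5-joint chain skeleton; real skeletons have more joints.
# Edges are (parent, child) pairs.
EDGES = [(0, 1), (1, 2), (2, 3), (3, 4)]

def bone_features(joints, edges=EDGES):
    """3D bone feature: vector from parent joint to child joint."""
    return np.stack([joints[c] - joints[p] for p, c in edges])

def normalized_adjacency(n, edges=EDGES):
    """Symmetrically normalized skeleton adjacency with self-loops,
    A_hat = D^{-1/2} (A + I) D^{-1/2}, as used in basic GCN layers."""
    A = np.eye(n)
    for p, c in edges:
        A[p, c] = A[c, p] = 1.0
    d_inv_sqrt = np.diag(1.0 / np.sqrt(A.sum(axis=1)))
    return d_inv_sqrt @ A @ d_inv_sqrt

def graph_conv(X, A_hat, W):
    """One spatial graph-convolution layer with ReLU: joint features are
    propagated along skeleton edges, then linearly projected by W."""
    return np.maximum(A_hat @ X @ W, 0.0)

def multi_loss(logits, feats, centers, labels, lam=0.01):
    """Softmax cross-entropy plus center loss (lam is an assumed weight)."""
    z = logits - logits.max(axis=1, keepdims=True)
    log_p = z - np.log(np.exp(z).sum(axis=1, keepdims=True))
    ce = -log_p[np.arange(len(labels)), labels].mean()
    center = 0.5 * ((feats - centers[labels]) ** 2).sum(axis=1).mean()
    return ce + lam * center

# Toy forward pass: one joint-stream GCN layer and the bone stream,
# fused by concatenation at the feature level.
joints = np.array([[0, 0, 0], [0, 0, 1], [0, 0, 2],
                   [0, 1, 2], [0, 2, 2]], dtype=float)
bones = bone_features(joints)                  # shape (4, 3)
A_j = normalized_adjacency(5)
rng = np.random.default_rng(0)
H_joint = graph_conv(joints, A_j, rng.standard_normal((3, 8)))
feat = np.concatenate([H_joint.mean(axis=0),   # pooled joint stream
                       bones.mean(axis=0)])    # pooled bone stream
```

In this sketch the fused descriptor `feat` would feed a classifier trained with `multi_loss`, where the center-loss term pulls same-subject gait features toward a learned class center while the softmax term keeps classes separable.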