AggreGait: Automatic gait feature extraction for human age and gender classification with possible occlusion

IF 2.3 · Q2 (Computer Science, Theory & Methods)
Array · Pub Date: 2025-03-05 · DOI: 10.1016/j.array.2025.100379
Timilehin B. Aderinola, Tee Connie, Thian Song Ong, Andrew Beng Jin Teoh, Michael Kah Ong Goh
{"title":"AggreGait: Automatic gait feature extraction for human age and gender classification with possible occlusion","authors":"Timilehin B. Aderinola ,&nbsp;Tee Connie ,&nbsp;Thian Song Ong ,&nbsp;Andrew Beng Jin Teoh ,&nbsp;Michael Kah Ong Goh","doi":"10.1016/j.array.2025.100379","DOIUrl":null,"url":null,"abstract":"<div><div>The growing interest in smart surveillance and automated public access control necessitates robust age and gender classification (AGC) techniques that can operate effectively in unconstrained environments. While model-based gait obtained via pose estimation offers a promising approach, its performance can be hindered by occlusions commonly encountered in real-world videos. In this work, we propose a custom Graph Neural Network (GNN) architecture, AggreGait, for robust AGC under occlusions. AggreGait integrates upper and lower body features with whole-body information for age and gender prediction. We train AggreGait on pose sequences from the gait-in-the-wild (GITW) dataset, simulating different types of occlusions. AggreGait performs comparably to existing methods, achieving an overall accuracy of 91% in unobstructed conditions. Notably, AggreGait maintains reasonable accuracy using only upper limb (or upper and lower limb) features, suggesting its potential for real-time surveillance applications despite occlusions. This work paves the way for practical gait-based AGC in unconstrained environments, enhancing the effectiveness of surveillance systems and facilitating automated access control.</div></div>","PeriodicalId":8417,"journal":{"name":"Array","volume":"26 ","pages":"Article 100379"},"PeriodicalIF":2.3000,"publicationDate":"2025-03-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Array","FirstCategoryId":"1085","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2590005625000062","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, THEORY & METHODS","Score":null,"Total":0}
Citations: 0

Abstract

The growing interest in smart surveillance and automated public access control necessitates robust age and gender classification (AGC) techniques that can operate effectively in unconstrained environments. While model-based gait obtained via pose estimation offers a promising approach, its performance can be hindered by occlusions commonly encountered in real-world videos. In this work, we propose a custom Graph Neural Network (GNN) architecture, AggreGait, for robust AGC under occlusions. AggreGait integrates upper and lower body features with whole-body information for age and gender prediction. We train AggreGait on pose sequences from the gait-in-the-wild (GITW) dataset, simulating different types of occlusions. AggreGait performs comparably to existing methods, achieving an overall accuracy of 91% in unobstructed conditions. Notably, AggreGait maintains reasonable accuracy using only upper limb (or upper and lower limb) features, suggesting its potential for real-time surveillance applications despite occlusions. This work paves the way for practical gait-based AGC in unconstrained environments, enhancing the effectiveness of surveillance systems and facilitating automated access control.
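The following is a minimal, illustrative sketch (not the authors' code) of the idea the abstract describes: pose keypoints are split into upper-body and lower-body groups, each group is encoded with a simple graph convolution, the branch features are fused with a whole-body branch, and the fused representation feeds separate age and gender heads. Occlusion is simulated by masking out one body part. The keypoint split, layer sizes, number of age groups, and per-frame (rather than per-sequence) processing are all assumptions made for brevity.

```python
# Hedged sketch of an AggreGait-style model: upper-, lower-, and whole-body
# graph branches fused for age and gender prediction, with simple keypoint
# masking to simulate occlusion. Indices and dimensions are illustrative.
import torch
import torch.nn as nn

# Assumed COCO-style 17-joint skeleton with 2-D coordinates per joint.
UPPER = list(range(0, 11))   # head, shoulders, elbows, wrists, hips (assumed split)
LOWER = list(range(11, 17))  # knees, ankles (assumed split)

class SimpleGraphConv(nn.Module):
    """One graph-convolution layer: X' = ReLU(A_norm @ (X W))."""
    def __init__(self, adj: torch.Tensor, in_dim: int, out_dim: int):
        super().__init__()
        deg = adj.sum(dim=1, keepdim=True).clamp(min=1.0)
        self.register_buffer("adj_norm", adj / deg)  # row-normalised adjacency
        self.lin = nn.Linear(in_dim, out_dim)

    def forward(self, x):                  # x: (batch, joints, in_dim)
        return torch.relu(self.adj_norm @ self.lin(x))

class AggreGaitSketch(nn.Module):
    def __init__(self, adj_full: torch.Tensor, hidden: int = 64):
        super().__init__()
        self.upper = SimpleGraphConv(adj_full[UPPER][:, UPPER], 2, hidden)
        self.lower = SimpleGraphConv(adj_full[LOWER][:, LOWER], 2, hidden)
        self.whole = SimpleGraphConv(adj_full, 2, hidden)
        self.age_head = nn.Linear(3 * hidden, 4)     # e.g. 4 age groups (assumed)
        self.gender_head = nn.Linear(3 * hidden, 2)  # binary gender classes

    def forward(self, keypoints):          # keypoints: (batch, 17, 2)
        # Pool each branch over its joints, then concatenate the three views.
        u = self.upper(keypoints[:, UPPER]).mean(dim=1)
        l = self.lower(keypoints[:, LOWER]).mean(dim=1)
        w = self.whole(keypoints).mean(dim=1)
        feat = torch.cat([u, l, w], dim=-1)
        return self.age_head(feat), self.gender_head(feat)

def simulate_occlusion(keypoints: torch.Tensor, part: str = "lower") -> torch.Tensor:
    """Zero out one body part's keypoints to mimic occlusion during training."""
    occluded = keypoints.clone()
    idx = LOWER if part == "lower" else UPPER
    occluded[:, idx] = 0.0
    return occluded
```

A usage example would pass a fixed skeleton adjacency matrix (shape 17 x 17) to `AggreGaitSketch`, feed batches of keypoints, and train with cross-entropy losses on both heads; during training, `simulate_occlusion` can randomly mask the upper or lower limbs so the model learns to rely on whichever body parts remain visible, which is the robustness property the abstract reports.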


Source journal: Array (Computer Science - General Computer Science)
CiteScore: 4.40 · Self-citation rate: 0.00% · Articles per year: 93 · Review time: 45 days