Human gait recognition using dense residual network and hybrid attention technique with back-flow mechanism

IF 2.9 · CAS Tier 3 (Engineering & Technology) · JCR Q2 (ENGINEERING, ELECTRICAL & ELECTRONIC)
Mohammad Iman Junaid, Sandeep Madarapu, Samit Ari
{"title":"Human gait recognition using dense residual network and hybrid attention technique with back-flow mechanism","authors":"Mohammad Iman Junaid,&nbsp;Sandeep Madarapu,&nbsp;Samit Ari","doi":"10.1016/j.dsp.2025.105401","DOIUrl":null,"url":null,"abstract":"<div><div>Gait recognition is a promising biometric technique for person identification, either as a standalone method or in combination with other modalities. A major challenge lies in extracting robust gait features from silhouettes that remain invariant to variation in clothing, carried objects, and camera viewpoints. Recent advances using attention-based convolutional neural networks (CNNs) have improved gait recognition performance; however, many existing methods struggle to preserve semantic information across network layers due to information loss during the stages of downsampling. To address this issue, a novel residual dense back-flow attention network (RDBA-Net) is proposed, which integrates dual-branch hybrid self-attention network (DHSAN) modules with densely connected residual dense blocks (RDBs), and the output features are concatenated in a back-flow direction. This design enables effective learning of discriminative gait features by leveraging attention cues at both spatial-level and temporal-level from silhouette sequences. Furthermore, back-flow mechanism enhances feature learning in earlier layers by reusing refined semantic information from deeper layers. Experimental evaluations on two benchmark datasets, CASIA B and OU-MVLP, demonstrate that RDBA-Net, achieves notable improvements in accuracy compared to existing state-of-the-art methods, with gains up to 91.6% on CASIA B and 89.2% on OU-MVLP under challenging conditions.</div></div>","PeriodicalId":51011,"journal":{"name":"Digital Signal Processing","volume":"166 ","pages":"Article 105401"},"PeriodicalIF":2.9000,"publicationDate":"2025-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Digital Signal Processing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1051200425004233","RegionNum":3,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
Citations: 0

Abstract

Gait recognition is a promising biometric technique for person identification, either as a standalone method or in combination with other modalities. A major challenge lies in extracting robust gait features from silhouettes that remain invariant to variations in clothing, carried objects, and camera viewpoints. Recent advances using attention-based convolutional neural networks (CNNs) have improved gait recognition performance; however, many existing methods struggle to preserve semantic information across network layers due to information loss during downsampling. To address this issue, a novel residual dense back-flow attention network (RDBA-Net) is proposed, which integrates dual-branch hybrid self-attention network (DHSAN) modules with densely connected residual dense blocks (RDBs), whose output features are concatenated in a back-flow direction. This design enables effective learning of discriminative gait features by leveraging attention cues at both the spatial and temporal levels of silhouette sequences. Furthermore, the back-flow mechanism enhances feature learning in earlier layers by reusing refined semantic information from deeper layers. Experimental evaluations on two benchmark datasets, CASIA B and OU-MVLP, demonstrate that RDBA-Net achieves notable improvements in accuracy over existing state-of-the-art methods, reaching up to 91.6% on CASIA B and 89.2% on OU-MVLP under challenging conditions.
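To make the two ideas named in the abstract more concrete, the sketch below illustrates (a) a densely connected residual block and (b) a back-flow style fusion in which a deeper, downsampled feature map is upsampled and concatenated back onto an earlier, high-resolution feature map. This is a minimal PyTorch illustration under stated assumptions, not the authors' RDBA-Net: the class names (ResidualDenseBlock, BackFlowFusion), growth rate, layer counts, 1x1 fusion convolution, and bilinear upsampling are placeholders chosen for clarity, and the DHSAN attention modules are omitted entirely.

```python
# Illustrative sketch only -- NOT the paper's implementation. All layer sizes,
# the fusion rule, and the module names are assumptions made for this example.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ResidualDenseBlock(nn.Module):
    """Densely connected 3x3 convolutions with a local residual connection."""

    def __init__(self, channels: int, growth: int = 16, num_layers: int = 3):
        super().__init__()
        self.layers = nn.ModuleList()
        in_ch = channels
        for _ in range(num_layers):
            self.layers.append(nn.Conv2d(in_ch, growth, kernel_size=3, padding=1))
            in_ch += growth  # dense connectivity: each layer sees all previous outputs
        # 1x1 fusion back to the block's input width so the residual add is valid
        self.fuse = nn.Conv2d(in_ch, channels, kernel_size=1)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        feats = [x]
        for conv in self.layers:
            feats.append(F.relu(conv(torch.cat(feats, dim=1))))
        return x + self.fuse(torch.cat(feats, dim=1))  # local residual learning


class BackFlowFusion(nn.Module):
    """Reuses a deeper (semantically richer) feature map at an earlier stage by
    upsampling it to the earlier resolution and fusing via concatenation."""

    def __init__(self, early_ch: int, deep_ch: int):
        super().__init__()
        self.reduce = nn.Conv2d(early_ch + deep_ch, early_ch, kernel_size=1)

    def forward(self, early: torch.Tensor, deep: torch.Tensor) -> torch.Tensor:
        deep_up = F.interpolate(deep, size=early.shape[-2:], mode="bilinear",
                                align_corners=False)
        return F.relu(self.reduce(torch.cat([early, deep_up], dim=1)))


if __name__ == "__main__":
    # Toy forward pass on a batch of 64x44 gait silhouettes (CASIA-B-like size).
    x = torch.randn(2, 1, 64, 44)
    stem = nn.Conv2d(1, 32, kernel_size=3, padding=1)
    rdb_early = ResidualDenseBlock(32)
    down = nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1)
    rdb_deep = ResidualDenseBlock(64)
    backflow = BackFlowFusion(early_ch=32, deep_ch=64)

    early = rdb_early(stem(x))             # early, high-resolution features
    deep = rdb_deep(down(early))           # deeper, downsampled features
    refined_early = backflow(early, deep)  # back-flow: deep semantics refine early features
    print(refined_early.shape)             # torch.Size([2, 32, 64, 44])
```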
Source Journal
Digital Signal Processing
Category: Engineering & Technology – Engineering: Electrical & Electronic
CiteScore: 5.30
Self-citation rate: 17.20%
Publication volume: 435
Review time: 66 days
Journal Description: Digital Signal Processing: A Review Journal is one of the oldest and most established journals in the field of signal processing, yet it aims to be the most innovative. The Journal invites top-quality research articles at the frontiers of research in all aspects of signal processing. Our objective is to provide a platform for the publication of ground-breaking research in signal processing with both academic and industrial appeal. The journal has a special emphasis on statistical signal processing methodology, such as Bayesian signal processing, and encourages articles on emerging applications of signal processing such as:
• big data
• machine learning
• internet of things
• information security
• systems biology and computational biology
• financial time series analysis
• autonomous vehicles
• quantum computing
• neuromorphic engineering
• human-computer interaction and intelligent user interfaces
• environmental signal processing
• geophysical signal processing including seismic signal processing
• chemoinformatics and bioinformatics
• audio, visual and performance arts
• disaster management and prevention
• renewable energy