Hand Segmentation With Dense Dilated U-Net and Structurally Incoherent Nonnegative Matrix Factorization-Based Gesture Recognition

IF 3.5 · CAS Region 3 (Computer Science) · Q2 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Kankana Roy; Rajiv R. Sahay
DOI: 10.1109/THMS.2024.3390415 · Published: 2024-03-07 · IEEE Xplore: https://ieeexplore.ieee.org/document/10522620/
Citations: 0

Abstract

Robust segmentation of hands in cluttered environments for hand gesture recognition has remained a challenge in computer vision. In this work, a two-stage gesture recognition framework is proposed. In the first stage, we segment hands using the proposed deep learning algorithm; in the second stage, we use these segmented hands to classify gestures with a novel structurally incoherent nonnegative matrix factorization approach. We propose a new deep learning framework for hand segmentation, called the densely dilated U-Net, which exploits recently proposed dense blocks and dilated convolution layers. To cope with the scarcity of labeled datasets, we extend the densely dilated U-Net to semisupervised hand segmentation using hand bounding boxes as cues. We provide quantitative and qualitative evaluations of the proposed hand segmentation model on several public hand segmentation datasets, including EgoHands, GTEA, EYTH, EDSH, and HOF. Semisupervised segmentation results are also obtained on two hand detection datasets, VIVA and CVRR. As an extension of our work, we show semisupervised segmentation and gesture recognition results using segmented hands on the NUS-II cluttered hand gesture dataset. To validate the efficiency of our semisupervised algorithm, we evaluate it on the OUHands dataset, which provides real ground-truth labels. For gesture classification, we propose a novel structurally incoherent nonnegative matrix factorization algorithm that operates on CNN features extracted from the segmented images. Experimental results on the NUS-II and OUHands datasets demonstrate that our two-stage approach to gesture recognition yields superior results.
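The abstract names the segmentation architecture's two key ingredients, dense blocks and dilated convolutions, without implementation detail. As a rough illustration only, the following is a minimal PyTorch sketch of what a "densely dilated" block inside a U-Net encoder could look like; the DenseDilatedBlock name, the growth rate, and the dilation rates are assumptions made for this sketch, not the paper's actual configuration.

```python
# Sketch of a "densely dilated" block: each 3x3 convolution uses a growing
# dilation rate, and its input is the concatenation of all earlier feature
# maps (dense connectivity). Channel widths and dilation rates here are
# illustrative assumptions, not the paper's configuration.
import torch
import torch.nn as nn

class DenseDilatedBlock(nn.Module):
    def __init__(self, in_channels: int, growth_rate: int = 32,
                 dilations=(1, 2, 4, 8)):
        super().__init__()
        self.layers = nn.ModuleList()
        channels = in_channels
        for d in dilations:
            self.layers.append(nn.Sequential(
                # padding == dilation keeps the spatial size of a 3x3 conv,
                # so feature maps from every layer can be concatenated
                nn.Conv2d(channels, growth_rate, kernel_size=3,
                          padding=d, dilation=d, bias=False),
                nn.BatchNorm2d(growth_rate),
                nn.ReLU(inplace=True),
            ))
            channels += growth_rate  # dense concatenation widens the input

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        features = [x]
        for layer in self.layers:
            features.append(layer(torch.cat(features, dim=1)))
        return torch.cat(features, dim=1)

# Example: a 64-channel input yields 64 + 4 * 32 = 192 output channels.
block = DenseDilatedBlock(64)
out = block(torch.randn(1, 64, 128, 128))  # shape (1, 192, 128, 128)
```

Dense connectivity reuses every earlier feature map, while the increasing dilation rates enlarge the receptive field without downsampling, which is why both are popular choices in segmentation encoders.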
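The abstract likewise does not state the factorization objective. One common reading of "structurally incoherent" NMF in the recognition literature is a penalty that discourages the basis matrices learned for different classes from sharing structure. The NumPy sketch below illustrates that idea with standard multiplicative updates; the si_nmf helper, the two-class setup, and the lam * ||W1^T W2||_F^2 penalty are assumptions for illustration, not the paper's formulation. Nonnegativity is consistent with the abstract's use of CNN features, which are nonnegative after ReLU activations.

```python
# Sketch of NMF with a structural-incoherence penalty between the bases of
# two gesture classes, minimizing
#     sum_c ||X_c - W_c H_c||_F^2  +  lam * ||W1^T W2||_F^2
# via multiplicative updates. This objective is an assumption, not the
# paper's exact formulation.
import numpy as np

def si_nmf(X1, X2, rank=20, lam=0.1, iters=200, eps=1e-9):
    """X1, X2: nonnegative (features x samples) matrices, one per class."""
    rng = np.random.default_rng(0)
    W1 = rng.random((X1.shape[0], rank)); H1 = rng.random((rank, X1.shape[1]))
    W2 = rng.random((X2.shape[0], rank)); H2 = rng.random((rank, X2.shape[1]))
    for _ in range(iters):
        # Standard NMF multiplicative updates for the coefficient matrices.
        H1 *= (W1.T @ X1) / (W1.T @ W1 @ H1 + eps)
        H2 *= (W2.T @ X2) / (W2.T @ W2 @ H2 + eps)
        # Basis updates: the lam * W2 @ W2.T @ W1 term in the denominator
        # shrinks components of W1 that align with W2 (and vice versa),
        # keeping the two bases structurally incoherent.
        W1 *= (X1 @ H1.T) / (W1 @ (H1 @ H1.T) + lam * (W2 @ (W2.T @ W1)) + eps)
        W2 *= (X2 @ H2.T) / (W2 @ (H2 @ H2.T) + lam * (W1 @ (W1.T @ W2)) + eps)
    return (W1, H1), (W2, H2)
```

With more than two classes, the same penalty would be summed over all pairs of basis matrices; how the learned factors feed the final classifier is not specified in the abstract.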
Source Journal
IEEE Transactions on Human-Machine Systems
Categories: Computer Science, Artificial Intelligence; Computer Science, Cybernetics
CiteScore: 7.10
Self-citation rate: 11.10%
Articles per year: 136
Journal description: The scope of the IEEE Transactions on Human-Machine Systems includes the field of human-machine systems. It covers human systems and human organizational interactions, including cognitive ergonomics, system test and evaluation, and human information processing concerns in systems and organizations.