Optical Flow guided Motion Template for Hand Gesture Recognition

Debajit Sarma, M. Bhuyan
DOI: 10.1109/ASPCON49795.2020.9276654
Published in: 2020 IEEE Applied Signal Processing Conference (ASPCON), 2020-10-07
Citations: 3

Abstract

Gesture representation, especially hand gesture representation, plays an important role in the human-computer interaction community. Model-based and appearance-based methods are the two primary techniques for hand gesture representation. Beyond these two, space-time features and motion-based approaches have achieved impressive performance in various action and gesture recognition applications. Space-time features treat actions/gestures as local spatiotemporal neighbourhoods, but most such features are computationally expensive. Motion-based approaches mainly comprise optical flow and motion templates. Optical flow hinges on estimating the motion of image pixels, whereas motion templates summarise the video-wide temporal evolution into representations that are widely used for action/gesture recognition. Both methods have their own advantages and are applied accordingly in motion analysis and related applications. In this paper, we combine the two and propose a new method that inherits the advantages of each by fusing a video's dynamics into a single temporal-template image. The main benefits of the technique are its simplicity, ease of implementation, competitive performance, and efficiency.
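The idea of fusing a video's dynamics into a single image is exemplified by the classic motion history image (MHI): pixels that moved recently are bright, pixels that moved earlier fade out. The sketch below is a minimal NumPy illustration of that template update, not the paper's method; for brevity it approximates per-pixel motion with frame differencing, whereas the paper guides the template with optical-flow magnitudes (a dense estimator such as Farneback's would typically supply those). The function name, threshold, and decay scheme are illustrative assumptions.

```python
import numpy as np

def motion_history_image(frames, tau=10, motion_thresh=0.1):
    """Fuse a video's motion into a single temporal template (MHI).

    Bright pixels moved recently; darker pixels moved earlier.
    NOTE: motion here is a frame-difference stand-in for the
    optical-flow magnitude an actual flow-guided template would use.
    """
    h, w = frames[0].shape
    mhi = np.zeros((h, w), dtype=np.float32)
    for prev, curr in zip(frames, frames[1:]):
        # Stand-in for dense optical-flow magnitude at each pixel
        motion = np.abs(curr - prev) > motion_thresh
        # MHI update: reset moving pixels to tau, decay the rest by 1
        mhi = np.where(motion, float(tau), np.maximum(mhi - 1.0, 0.0))
    return mhi

# Toy example: a bright square sliding right across a black background
frames = []
for t in range(5):
    f = np.zeros((8, 8), dtype=np.float32)
    f[2:4, t:t + 2] = 1.0
    frames.append(f)

template = motion_history_image(frames, tau=4)
```

After the loop, `template` encodes where and roughly when motion occurred: the square's most recent edge positions hold the value `tau`, earlier positions have decayed toward zero, and static background stays at zero.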