Human–Computer Interactive Gesture Feature Capture and Recognition in Virtual Reality

Fan Zhang
{"title":"Human–Computer Interactive Gesture Feature Capture and Recognition in Virtual Reality","authors":"Fan Zhang","doi":"10.1177/1064804620924133","DOIUrl":null,"url":null,"abstract":"With the development of computer technology, the simulation authenticity of virtual reality technology is getting higher and higher, and the accurate recognition of human–computer interaction gestures is also the key technology to enhance the authenticity of virtual reality. This article briefly introduced three different gesture feature extraction methods: scale invariant feature transform, local binary pattern and histogram of oriented gradients (HOG), and back-propagation (BP) neural network for classifying and recognizing different gestures. The gesture feature vectors obtained by three feature extraction methods were used as input data of BP neural network respectively and were simulated in MATLAB software. The results showed that the information of feature gesture diagram extracted by HOG was the closest to the original one; the BP neural network that applied HOG extracted feature vectors converged to stability faster and had the smallest error when it was stable; in the aspect of gesture recognition, the BP neural network that applied HOG extracted feature vector had higher accuracy and precision and lower false alarm rate.","PeriodicalId":357563,"journal":{"name":"Ergonomics in Design: The Quarterly of Human Factors Applications","volume":"10 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ergonomics in Design: The Quarterly of Human Factors Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1177/1064804620924133","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

As computer technology advances, the simulation fidelity of virtual reality keeps improving, and accurate recognition of human–computer interaction gestures is a key technology for making virtual reality more realistic. This article briefly introduces three gesture feature extraction methods: scale-invariant feature transform (SIFT), local binary pattern (LBP), and histogram of oriented gradients (HOG), together with a back-propagation (BP) neural network for classifying and recognizing gestures. The gesture feature vectors produced by each of the three extraction methods were fed into the BP neural network and simulated in MATLAB. The results showed that the feature map extracted by HOG preserved the information of the original gesture image most closely; the BP neural network trained on HOG feature vectors converged to a stable state fastest and had the smallest error once stable; and, for gesture recognition, the BP neural network using HOG feature vectors achieved higher accuracy and precision and a lower false alarm rate.
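To make the described pipeline concrete, the following is a minimal sketch of the best-performing combination in the abstract: HOG feature extraction feeding a BP neural network classifier. The original study used MATLAB; this Python version with scikit-image and scikit-learn is an assumption for illustration only, and the dataset, image size, HOG parameters, and network layout are hypothetical rather than taken from the paper.

```python
# Sketch: HOG features + BP (back-propagation) neural network for gesture
# classification. Illustrative only; not the authors' MATLAB implementation.

import numpy as np
from skimage.feature import hog
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

def extract_hog(images):
    """Compute a HOG descriptor for each grayscale gesture image."""
    return np.array([
        hog(img,
            orientations=9,           # 9 gradient-orientation bins
            pixels_per_cell=(8, 8),   # local cell size
            cells_per_block=(2, 2),   # block normalization window
            block_norm="L2-Hys")
        for img in images
    ])

# Hypothetical data: 200 grayscale 64x64 gesture images, 5 gesture classes.
rng = np.random.default_rng(0)
images = rng.random((200, 64, 64))
labels = rng.integers(0, 5, size=200)

X = extract_hog(images)
X_train, X_test, y_train, y_test = train_test_split(
    X, labels, test_size=0.25, random_state=0)

# A single-hidden-layer perceptron trained with back-propagation stands in
# for the BP neural network used in the paper.
bp_net = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
bp_net.fit(X_train, y_train)
print("test accuracy:", bp_net.score(X_test, y_test))
```

Swapping `extract_hog` for an LBP- or SIFT-based descriptor would reproduce the other two input conditions the abstract compares, with the same downstream BP network.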