Image-Based Methods for Interaction with Head-Worn Worker-Assistance Systems

Frerk Saxen, Omer Rashid, A. Al-Hamadi, S. Adler, A. Kernchen, R. Mecke
Journal: 智能学习系统与应用(英文) (Journal of Intelligent Learning Systems and Applications), Vol. 33(1), pp. 141-152
DOI: 10.4236/JILSA.2014.63011
Published: 2014-08-11 · Citations: 3

Abstract

In this paper, a mobile assistance system is described which supports users in performing manual working tasks in the context of assembling complex products. The assistance system contains a head-worn display for visualizing information relevant to the workflow, as well as a video camera to acquire the scene. This paper focuses on the user's interaction with this system and describes work in progress and initial results from an industrial application scenario. We present image-based methods for robust recognition of static and dynamic hand gestures in real time. These methods enable intuitive interaction with the assistance system. Segmentation of the hand based on color information forms the basis for feature extraction for both static and dynamic gestures. For static gestures, interaction is triggered when the user's hand activates particular sensitive regions in the camera image. An HMM classifier recognizes dynamic gestures from motion parameters derived from the optical flow in the camera image.
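The static-gesture path described above (color-based skin segmentation followed by sensitive-region activation) can be sketched roughly as follows. This is a minimal illustration, not the paper's implementation: the YCrCb chrominance thresholds, region layout, and the `min_fill` activation criterion are illustrative assumptions, not the authors' calibrated parameters.

```python
import numpy as np

def skin_mask_ycrcb(img_ycrcb, cr_range=(133, 173), cb_range=(77, 127)):
    """Binary skin mask from chrominance thresholds in YCrCb space.
    Threshold values are a commonly cited heuristic, assumed here."""
    cr, cb = img_ycrcb[..., 1], img_ycrcb[..., 2]
    return ((cr >= cr_range[0]) & (cr <= cr_range[1]) &
            (cb >= cb_range[0]) & (cb <= cb_range[1]))

def region_activated(mask, region, min_fill=0.5):
    """A sensitive region fires when enough of it is covered by skin pixels."""
    y0, y1, x0, x1 = region
    return mask[y0:y1, x0:x1].mean() >= min_fill

# Toy frame: 100x100 YCrCb image with skin-like chrominance in the
# top-left quadrant, standing in for the user's hand.
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[..., 1] = 100          # background Cr, outside the skin range
frame[..., 2] = 100
frame[:50, :50, 1] = 150     # "hand": Cr within the skin range
frame[:50, :50, 2] = 100     # Cb within the skin range

mask = skin_mask_ycrcb(frame)
print(region_activated(mask, (0, 50, 0, 50)))      # hand covers this region
print(region_activated(mask, (50, 100, 50, 100)))  # empty region
```

In practice the activation would be checked per frame and debounced over time so that a hand briefly crossing a region does not trigger a command.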
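The dynamic-gesture path (HMM classification of motion parameters) can likewise be sketched. The abstract derives motion parameters from optical flow; in this simplified, hypothetical sketch they are reduced to quantized direction codes of the hand centroid's frame-to-frame displacement, and the transition/emission probabilities of the two toy gesture models are invented for illustration.

```python
import numpy as np

def quantize_motion(displacements):
    """Map (dx, dy) centroid displacements to 4 direction symbols:
    0 = right, 1 = left, 2 = up, 3 = down (image y grows downward)."""
    codes = []
    for dx, dy in displacements:
        if abs(dx) >= abs(dy):
            codes.append(0 if dx > 0 else 1)
        else:
            codes.append(3 if dy > 0 else 2)
    return codes

def log_likelihood(obs, pi, A, B):
    """Forward algorithm for a discrete-emission HMM, in log space."""
    alpha = np.log(pi) + np.log(B[:, obs[0]])
    for o in obs[1:]:
        alpha = np.log(np.exp(alpha) @ A) + np.log(B[:, o])
    return np.logaddexp.reduce(alpha)

# Two illustrative 2-state gesture models sharing pi and A; the emission
# matrices bias each model toward one dominant motion direction.
pi = np.array([0.5, 0.5])
A = np.array([[0.8, 0.2],
              [0.2, 0.8]])
B_right = np.array([[0.7, 0.1, 0.1, 0.1],
                    [0.4, 0.2, 0.2, 0.2]])
B_left = np.array([[0.1, 0.7, 0.1, 0.1],
                   [0.2, 0.4, 0.2, 0.2]])
models = {"swipe_right": (pi, A, B_right),
          "swipe_left": (pi, A, B_left)}

# Centroid displacements of a rightward hand motion over three frames.
obs = quantize_motion([(5, 1), (4, 0), (6, -1)])
best = max(models, key=lambda g: log_likelihood(obs, *models[g]))
print(best)  # swipe_right
```

In a full system the HMM parameters would be trained per gesture (e.g. via Baum-Welch) on recorded motion sequences, and classification would pick the model with the highest likelihood for the observed sequence, as above.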