Image-Based Methods for Interaction with Head-Worn Worker-Assistance Systems

Frerk Saxen, Omer Rashid, A. Al-Hamadi, S. Adler, A. Kernchen, R. Mecke

Journal of Intelligent Learning Systems and Applications, 2014, pp. 141-152. DOI: 10.4236/JILSA.2014.63011
Published: 2014-08-11 · Citations: 3
Abstract
In this paper, a mobile assistance system is described that supports users in performing manual working tasks in the context of assembling complex products. The assistance system contains a head-worn display for the visualization of workflow-relevant information as well as a video camera to acquire the scene. This paper focuses on the user's interaction with this system and describes work in progress and initial results from an industrial application scenario. We present image-based methods for the robust recognition of static and dynamic hand gestures in real time. These methods enable intuitive interaction with the assistance system. Segmentation of the hand based on color information forms the basis of feature extraction for static and dynamic gestures. For static gestures, interaction is triggered when the user's hand activates particular sensitive regions in the camera image. An HMM classifier recognizes dynamic gestures from motion parameters derived from the optical flow in the camera image.
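
The abstract gives no implementation details, but the color-segmentation and sensitive-region idea can be illustrated with a minimal Python/OpenCV sketch: threshold likely skin pixels, then test whether the hand covers enough of a predefined screen region. The color space, threshold values, region format, and activation ratio below are all hypothetical assumptions, not taken from the paper.

```python
# Illustrative sketch only: the paper describes color-based hand segmentation
# and "sensitive regions" for static gestures but gives no code. The color
# bounds, region coordinates, and activation ratio here are hypothetical.
import cv2
import numpy as np

# Hypothetical skin-color bounds in YCrCb space (a common choice for skin
# segmentation; the paper does not specify its color model or thresholds).
SKIN_LOW = np.array([0, 133, 77], dtype=np.uint8)
SKIN_HIGH = np.array([255, 173, 127], dtype=np.uint8)

def segment_hand(frame_bgr):
    """Return a binary mask of likely hand pixels via color thresholding."""
    ycrcb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2YCrCb)
    mask = cv2.inRange(ycrcb, SKIN_LOW, SKIN_HIGH)
    # Morphological opening removes small speckle noise from the mask.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)

def region_activated(mask, region, min_fill=0.4):
    """Check whether the hand covers enough of one sensitive region.

    region is (x, y, w, h) in image coordinates; min_fill is a
    hypothetical activation threshold.
    """
    x, y, w, h = region
    roi = mask[y:y + h, x:x + w]
    return (roi > 0).mean() >= min_fill
```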
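The dynamic-gesture step can be sketched the same way: the paper only states that an HMM classifies motion parameters derived from the optical flow, so the fragment below assumes one Gaussian HMM per gesture class (via hmmlearn) scored on per-frame mean-flow features. The feature choice, state count, and library are assumptions for illustration.

```python
# Illustrative sketch only: assumes one GaussianHMM per gesture class and a
# simple per-frame mean optical-flow feature; neither is specified in the paper.
import cv2
import numpy as np
from hmmlearn.hmm import GaussianHMM

def flow_features(prev_gray, gray):
    """Mean horizontal/vertical flow for one frame pair (hypothetical feature)."""
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, gray, None, 0.5, 3, 15, 3, 5, 1.2, 0)
    return np.array([flow[..., 0].mean(), flow[..., 1].mean()])

def train(gesture_sequences, n_states=4):
    """Fit one HMM per gesture; gesture_sequences maps name -> list of (T, 2) arrays."""
    models = {}
    for name, seqs in gesture_sequences.items():
        X = np.concatenate(seqs)          # stack all sequences for this gesture
        lengths = [len(s) for s in seqs]  # per-sequence lengths for hmmlearn
        models[name] = GaussianHMM(n_components=n_states).fit(X, lengths)
    return models

def classify(sequence, models):
    """Pick the gesture whose HMM assigns the sequence the highest log-likelihood."""
    return max(models, key=lambda name: models[name].score(sequence))
```

Classifying by maximum per-class likelihood is the standard way to use generative HMMs for gesture recognition; whether the authors used this scheme or a single discriminative model is not stated in the abstract.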