A robust tracking algorithm for 3D hand gesture with rapid hand motion through deep learning
Jordi Sanchez-Riera, Yuan-Sheng Hsiao, Tekoing Lim, K. Hua, Wen-Huang Cheng
2014 IEEE International Conference on Multimedia and Expo Workshops (ICMEW), 2014-07-14
DOI: 10.1109/ICMEW.2014.6890556
Citations: 16
Abstract
There are two main problems that make hand gesture tracking especially difficult: the large number of degrees of freedom of the hand, and the rapid movements that occur in natural gestures. Algorithms based on minimizing an objective function, given a good initialization, typically obtain good accuracy at low frame rates. However, these methods depend heavily on the initialization point, and fast changes in hand position or gesture cause a loss of track from which they are unable to recover. We present a method that uses deep learning to train on a set of 81 gestures, which provides a rough estimate of the hand pose and orientation. This estimate then serves as input to a non-rigid model registration algorithm that finds the hand parameters, even when the temporal assumption of smooth hand motion is violated. To evaluate the proposed algorithm, different experiments are performed on real sequences recorded with an Intel depth sensor to demonstrate its performance in a real scenario.
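As a rough illustration of the two-stage idea described in the abstract (a learned coarse estimate over the 81 trained gestures, followed by per-frame non-rigid refinement), the sketch below uses hypothetical stand-ins; classify_gesture, register_nonrigid, and track_frame are not the paper's actual network or registration method, and the nearest-neighbor refinement is only a toy placeholder.

```python
# Minimal sketch of a two-stage hand tracker, assuming:
# (1) a coarse classifier picks one of 81 gesture templates per frame,
# (2) a non-rigid refinement step fits that template to the observed points.
# All names and the refinement rule here are illustrative, not the paper's.

import numpy as np

NUM_GESTURES = 81  # size of the trained gesture set reported in the abstract

def classify_gesture(depth_patch, weights):
    """Hypothetical coarse classifier: a single linear layer over the
    flattened depth patch, standing in for the deep network."""
    logits = weights @ depth_patch.ravel()
    return int(np.argmax(logits))  # index of the estimated gesture template

def register_nonrigid(template_pts, observed_pts, iters=20, step=0.5):
    """Toy non-rigid refinement: iteratively pull each template point toward
    its nearest observed point (a crude ICP-like stand-in for the paper's
    registration algorithm)."""
    pts = template_pts.copy()
    for _ in range(iters):
        # distance from every template point to every observed point
        d = np.linalg.norm(pts[:, None, :] - observed_pts[None, :, :], axis=2)
        nearest = observed_pts[np.argmin(d, axis=1)]
        pts += step * (nearest - pts)
    return pts

def track_frame(depth_patch, point_cloud, templates, weights):
    """Per-frame tracking: the classifier re-initializes the pose each frame,
    so a fast hand motion does not leave the refinement stuck in a bad basin."""
    gid = classify_gesture(depth_patch, weights)            # rough pose/gesture
    return register_nonrigid(templates[gid], point_cloud)   # refined hand points
```

The key design point mirrored here is that the classifier output is recomputed every frame, so the refinement stage never relies on the previous frame's solution and can recover after rapid hand motion.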