{"title":"人机对话系统中手势识别的新方法","authors":"Pujan Ziaie, T. Müller, Alois Knoll","doi":"10.1109/IPTA.2008.4743760","DOIUrl":null,"url":null,"abstract":"In this paper, a reliable, fast and robust approach for static hand gesture recognition in the domain of a human-robot interaction system is presented. The method is based on computing the likelihood of different existing gesture-types and assigning a probability to every type by using Bayesian inference rules. For this purpose, two classes of geometrical invariants has been defined and the gesture likelihoods of both of the invariant-classes are estimated by means of a modified K-nearest neighbors classifier. One of the invariant-classes consists of the well-known Hu moments and the other one encompasses five defined geometrical attributes that are transformation, rotation and scale invariant, which are obtained from the outer-contour of a hand. Given the experimental results of this approach in the domain of the Joint-Action Science and Technology (JAST) project, it appears to have a very considerable performance of more than 95% correct classification results on average for three types of gestures (pointing, grasping and holding-out) under various lighting conditions and hand poses.","PeriodicalId":384072,"journal":{"name":"2008 First Workshops on Image Processing Theory, Tools and Applications","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2008-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"20","resultStr":"{\"title\":\"A Novel Approach to Hand-Gesture Recognition in a Human-Robot Dialog System\",\"authors\":\"Pujan Ziaie, T. Müller, Alois Knoll\",\"doi\":\"10.1109/IPTA.2008.4743760\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In this paper, a reliable, fast and robust approach for static hand gesture recognition in the domain of a human-robot interaction system is presented. The method is based on computing the likelihood of different existing gesture-types and assigning a probability to every type by using Bayesian inference rules. For this purpose, two classes of geometrical invariants has been defined and the gesture likelihoods of both of the invariant-classes are estimated by means of a modified K-nearest neighbors classifier. One of the invariant-classes consists of the well-known Hu moments and the other one encompasses five defined geometrical attributes that are transformation, rotation and scale invariant, which are obtained from the outer-contour of a hand. 
Given the experimental results of this approach in the domain of the Joint-Action Science and Technology (JAST) project, it appears to have a very considerable performance of more than 95% correct classification results on average for three types of gestures (pointing, grasping and holding-out) under various lighting conditions and hand poses.\",\"PeriodicalId\":384072,\"journal\":{\"name\":\"2008 First Workshops on Image Processing Theory, Tools and Applications\",\"volume\":\"14 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2008-11-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"20\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2008 First Workshops on Image Processing Theory, Tools and Applications\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IPTA.2008.4743760\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2008 First Workshops on Image Processing Theory, Tools and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IPTA.2008.4743760","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Novel Approach to Hand-Gesture Recognition in a Human-Robot Dialog System
In this paper, a reliable, fast and robust approach to static hand-gesture recognition for a human-robot interaction system is presented. The method computes the likelihood of each existing gesture type and assigns a probability to every type using Bayesian inference rules. For this purpose, two classes of geometrical invariants have been defined, and the gesture likelihoods for both invariant classes are estimated by means of a modified K-nearest-neighbors classifier. One invariant class consists of the well-known Hu moments; the other comprises five geometrical attributes, obtained from the outer contour of the hand, that are invariant to translation, rotation and scale. In experiments within the Joint-Action Science and Technology (JAST) project, the approach achieved an average correct-classification rate above 95% for three gesture types (pointing, grasping and holding-out) under various lighting conditions and hand poses.
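The following is a minimal sketch, not the authors' implementation, of the idea the abstract describes: per-gesture likelihoods estimated by a K-nearest-neighbors vote over two feature sets (Hu moments and contour attributes), fused with a prior via Bayes' rule. The feature extraction is stubbed with synthetic data, and the class names, neighbor count, smoothing, and independence assumption between the two feature sets are illustrative assumptions only.

```python
import numpy as np

GESTURES = ["pointing", "grasping", "holding-out"]  # gesture types named in the abstract
K = 5  # number of neighbors used for the likelihood estimate (assumed value)

def knn_likelihoods(train_feats, train_labels, query, k=K):
    """Estimate p(feature | gesture) for each gesture as the fraction of the
    query's k nearest training samples carrying that gesture label,
    with Laplace smoothing so no likelihood is exactly zero."""
    dists = np.linalg.norm(train_feats - query, axis=1)
    nearest = train_labels[np.argsort(dists)[:k]]
    counts = np.array([(nearest == g).sum() for g in range(len(GESTURES))], float)
    return (counts + 1.0) / (k + len(GESTURES))

def posterior(lik_hu, lik_geom, prior=None):
    """Combine the two likelihood vectors with a prior via Bayes' rule,
    treating the feature sets as conditionally independent (an assumption)."""
    prior = np.full(len(GESTURES), 1.0 / len(GESTURES)) if prior is None else prior
    unnorm = prior * lik_hu * lik_geom
    return unnorm / unnorm.sum()

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Synthetic stand-ins: 7 Hu moments and 5 contour attributes per training sample.
    hu_train = rng.normal(size=(60, 7))
    geom_train = rng.normal(size=(60, 5))
    labels = rng.integers(0, 3, size=60)
    p = posterior(knn_likelihoods(hu_train, labels, rng.normal(size=7)),
                  knn_likelihoods(geom_train, labels, rng.normal(size=5)))
    print(dict(zip(GESTURES, np.round(p, 3))))
```

In practice the stubbed features would be replaced by Hu moments and contour attributes computed from the segmented hand silhouette; the fusion step itself is independent of how those features are obtained.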