A Kinect based gesture recognition algorithm using GMM and HMM
Yang Song, Yu Gu, Peisen Wang, Yuanning Liu, A. Li
2013 6th International Conference on Biomedical Engineering and Informatics, December 2013
DOI: 10.1109/BMEI.2013.6747040
Citations: 27
Abstract
Gesture recognition is a promising field in robotics and many Human-Computer Interaction (HCI) related areas. This research uses the Microsoft® Kinect to capture the 3D position data of skeletal joints, and uses a Gaussian Mixture Model (GMM) and Hidden Markov Model (HMM) to model full-body gestures. We propose a gesture recognition algorithm that segments gestures from a real-time data stream and recognizes predefined full-body gestures in real time. The proposed method achieves a recognition rate of 94.36%, demonstrating its capability.
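To make the GMM+HMM modeling concrete, below is a minimal sketch (not the authors' implementation) of training one GMM-HMM per gesture class on sequences of per-frame Kinect joint-position feature vectors and classifying a segmented gesture by maximum log-likelihood. It assumes the hmmlearn library, a hypothetical feature dimension of 60 (e.g. 20 joints × x, y, z), and illustrative state/mixture counts; the paper's own segmentation step from the real-time data flow is not shown.

```python
# Sketch: per-class GMM-HMM gesture models, assuming hmmlearn and numpy.
import numpy as np
from hmmlearn.hmm import GMMHMM  # HMM with Gaussian-mixture emissions

N_FEATURES = 60  # illustrative: 20 joints x (x, y, z) coordinates


def train_gesture_models(train_data):
    """train_data: dict mapping gesture label -> list of (T_i, N_FEATURES) arrays."""
    models = {}
    for label, sequences in train_data.items():
        X = np.concatenate(sequences)               # stack frames of all sequences
        lengths = [len(seq) for seq in sequences]   # per-sequence frame counts
        model = GMMHMM(n_components=5, n_mix=3,     # illustrative state/mixture counts
                       covariance_type="diag", n_iter=50)
        model.fit(X, lengths)                       # Baum-Welch (EM) training
        models[label] = model
    return models


def recognize(models, sequence):
    """Return the gesture label whose model gives the highest log-likelihood."""
    scores = {label: m.score(sequence) for label, m in models.items()}
    return max(scores, key=scores.get)
```

In this kind of pipeline, each incoming segmented gesture (a T × N_FEATURES array) is scored against every class model, and the highest-scoring class is reported as the recognized gesture.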