Feature extraction from 2D gesture trajectory in Malaysian Sign Language recognition
Yona Falinie bte Abdul Gaus, Farrah Wong Hock Tze, K. T. T. Kin
2011 4th International Conference on Mechatronics (ICOM), 17 May 2011
DOI: 10.1109/ICOM.2011.5937179
Cited by: 6
Abstract
In this paper, a method to identify hand gesture trajectories in a constrained environment is introduced. The method consists of three modules: collection of input images, skin segmentation and feature extraction. To reduce processing time, we compute the absolute difference between consecutive frames and select the frames with the highest values. The YCbCr colour space is selected as the skin model because the illumination component is concentrated in a single channel (Y), while the blue and red chrominance components are carried in Cb and Cr. The hand gesture trajectory is recognized using two methods: template matching and division by shape. Template matching requires removal of the signer's head, leaving only the two hands. For division by shape, the gestures are grouped into five classes of hand posture: vertical, horizontal, 45° above, 45° below and overlapping hands. A total of 43 frames were selected manually for each hand posture and analyzed to obtain the variation of hand gesture features such as width, height, angle and distance. Our experimental results show up to 80% accuracy in identifying the forms of the gesture trajectory. This shows that the feature extraction method proposed in this paper is appropriate for defining particular gesture trajectories.
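The two preprocessing steps described in the abstract, key-frame selection by absolute frame difference and skin segmentation in YCbCr space, could be sketched as follows. This is a minimal NumPy sketch, not the paper's implementation; the Cb/Cr thresholds are commonly cited values from the skin-detection literature, assumed here for illustration.

```python
import numpy as np

def frame_difference(prev, curr):
    """Sum of absolute pixel differences between two consecutive grayscale frames."""
    return int(np.abs(curr.astype(np.int32) - prev.astype(np.int32)).sum())

def select_key_frame(frames):
    """Index of the frame whose difference from its predecessor is largest,
    mirroring the paper's idea of keeping the frames with the highest difference."""
    diffs = [frame_difference(frames[i - 1], frames[i]) for i in range(1, len(frames))]
    return 1 + int(np.argmax(diffs))

def rgb_to_ycbcr(rgb):
    """Convert an RGB image (uint8, H x W x 3) to YCbCr using the BT.601 full-range matrix."""
    r = rgb[..., 0].astype(np.float64)
    g = rgb[..., 1].astype(np.float64)
    b = rgb[..., 2].astype(np.float64)
    y  =        0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return np.stack([y, cb, cr], axis=-1)

def skin_mask(rgb, cb_range=(77, 127), cr_range=(133, 173)):
    """Boolean skin mask: threshold only the chrominance channels (Cb, Cr),
    ignoring Y, so the test is largely insensitive to illumination."""
    ycbcr = rgb_to_ycbcr(rgb)
    cb, cr = ycbcr[..., 1], ycbcr[..., 2]
    return ((cb >= cb_range[0]) & (cb <= cb_range[1]) &
            (cr >= cr_range[0]) & (cr <= cr_range[1]))
```

Thresholding Cb and Cr while discarding Y is the design choice the abstract motivates: lighting changes mostly move the Y channel, so a chrominance-only rule keeps the skin cluster compact across frames.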