Handwriting Classification based on Hand Movement using ConvLSTM
Awang Karisma As’Ad Adi Asta, E. M. Yuniarno, S. M. S. Nugroho, Cries Avian
2023 International Seminar on Intelligent Technology and Its Applications (ISITIA), published 2023-07-26
DOI: 10.1109/ISITIA59021.2023.10221037
Abstract
The recognition of handwritten text is challenging because of the variability and complexity of human handwriting, which makes its subtle nuances difficult to capture with traditional methods. Hand gesture recognition has emerged as an alternative for predicting handwritten text, using sensors such as Kinect, LeapMotion, gyroscopes, accelerometers, and electromyography to extract geometric and spatial information. Continuous hand-gesture recognition with cameras is preferred for its ease of use and low hardware cost. Researchers have proposed various methods for recognizing hand gestures, including fuzzy logic, deterministic finite automata, trajectory-based methods, and dynamic probability long short-term memory (DP-LSTM). However, recent research has shown that LSTM alone can lose spatial information. This work therefore proposes an architecture that captures spatial information by combining a Convolutional Neural Network (CNN) with LSTM as a ConvLSTM, achieving high recognition rates on hand gesture trajectories for the English letters a to e captured with MediaPipe. The results show that the proposed model achieves high classification accuracy, attaining 0.8438.
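
To make the described pipeline concrete, below is a minimal sketch of how MediaPipe hand-landmark trajectories could be fed to a ConvLSTM classifier for the five letter classes. The sequence length T, the layer widths, and the treatment of each frame as a 21x3x1 landmark "image" are illustrative assumptions, not the authors' reported configuration.

```python
# Minimal sketch (assumptions): MediaPipe Hands yields 21 landmarks per frame
# with (x, y, z) coordinates; sequences are padded/truncated to a fixed length T.
# Layer sizes and hyperparameters are illustrative, not the paper's settings.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

T = 60             # frames per gesture sequence (assumed)
NUM_LANDMARKS = 21 # MediaPipe hand landmarks
NUM_COORDS = 3     # x, y, z per landmark
NUM_CLASSES = 5    # letters a to e

def build_convlstm_classifier():
    # Each frame is treated as a small 21x3 grid with one channel so the
    # convolutional kernels capture spatial relations between landmarks
    # while the recurrent state tracks the trajectory over time.
    inputs = layers.Input(shape=(T, NUM_LANDMARKS, NUM_COORDS, 1))
    x = layers.ConvLSTM2D(32, kernel_size=(3, 3), padding="same",
                          return_sequences=True)(inputs)
    x = layers.BatchNormalization()(x)
    x = layers.ConvLSTM2D(16, kernel_size=(3, 3), padding="same",
                          return_sequences=False)(x)
    x = layers.Flatten()(x)
    x = layers.Dense(64, activation="relu")(x)
    outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)
    model = models.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    model = build_convlstm_classifier()
    # Dummy batch standing in for real MediaPipe landmark trajectories.
    x_dummy = np.random.rand(8, T, NUM_LANDMARKS, NUM_COORDS, 1).astype("float32")
    y_dummy = np.random.randint(0, NUM_CLASSES, size=(8,))
    model.fit(x_dummy, y_dummy, epochs=1, verbose=0)
    print(model.predict(x_dummy[:1]).shape)  # (1, 5) class probabilities
```

In practice the dummy arrays would be replaced by landmark sequences extracted per frame with MediaPipe Hands and normalized (e.g., relative to the wrist landmark) before training.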