Indian Sign Language Gesture Recognition in Real-Time using Convolutional Neural Networks

Authors: Aniket Kumar, M. Madaan, Shubham Kumar, Aniket Saha, Suman Yadav
Published in: 2021 8th International Conference on Signal Processing and Integrated Networks (SPIN)
Publication date: 2021-08-26
DOI: 10.1109/SPIN52536.2021.9566005 (https://doi.org/10.1109/SPIN52536.2021.9566005)
Citations: 1
Abstract
Communication is a basic human need for exchanging feelings, thoughts, and ideas, yet the hearing and speech impaired community finds it difficult to interact with the vast majority of people. Sign language facilitates communication between hearing and speech impaired people and the rest of society. The Indian government has also passed the Rights of Persons with Disabilities (RPWD) Act, 2016, which recognizes Indian Sign Language (ISL) and mandates the use of sign language interpreters in all government-aided organizations and public sector proceedings. Unfortunately, a large percentage of the Indian population is not familiar with the semantics of the gestures associated with ISL. To bridge this communication gap, this paper proposes a model that identifies and classifies Indian Sign Language gestures in real time using Convolutional Neural Networks (CNNs). The model was developed using OpenCV and the Keras implementation of CNNs, and it classifies 36 ISL gestures, representing the digits 0-9 and the letters A-Z, by converting them to their text equivalents. The dataset created for this work consists of 300 images per gesture, which were fed into the CNN model for training and testing. The proposed model was successfully implemented and achieved 99.91% accuracy on the test images.
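As a rough illustration of the approach the abstract describes, the sketch below builds a Keras CNN classifier over 36 output classes (digits 0-9 plus letters A-Z). The layer sizes, the 64x64 grayscale input shape, and the specific architecture are assumptions for illustration only; the paper does not publish these details here.

```python
# Hypothetical sketch of a Keras CNN for 36-class ISL gesture
# classification. The input shape (64x64 grayscale) and all layer
# widths are assumed, not taken from the paper.
from tensorflow.keras import layers, models

NUM_CLASSES = 36  # digits 0-9 plus letters A-Z

def build_model(input_shape=(64, 64, 1)):
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu"),
        layers.MaxPooling2D((2, 2)),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.5),  # regularization against overfitting
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_model()
```

In a real-time setting such a model would typically be fed cropped hand regions captured frame-by-frame with OpenCV (`cv2.VideoCapture`), with each predicted class mapped back to its text equivalent.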