{"title":"Sign Language to Text for Deaf and Dumb","authors":"Vibhu Gupta, Mansi Jain, Garima Aggarwal","doi":"10.1109/confluence52989.2022.9734196","DOIUrl":null,"url":null,"abstract":"Hand-Sign Language Gestures are nonverbal messages that help in communication and can be understood with vision. Since the only disability Deaf & Dumb people have is communication related to speech therefore they cannot use spoken languages hence the only way for them to communicate with the people having same disability is through this hand-sign language. This causes a language barrier as normal people can not understand their hand sign language and vice a versa also since most of the people are not familiar with the same and interpreters are not very user-friendly they end up in a deadlock. Therefore, this paper proposes a CNN-based method for deciphering sign language and then converting it to text. In the proposed scheme of this paper the main focus is on fingerspelling and an additional feature of emotion recognition to support the interpretation with the 3rd component of sign language i.e non-manual features, a real-time solution for easy interpretation of sign language for normal human beings as well as Deaf & Dumb people using convolutional neural networks breaking the language barrier. The experimental results show that this paper achieved an accuracy score of 99.8% on testing data which is better than the majority of the recent research papers on American gesture-based communication","PeriodicalId":261941,"journal":{"name":"2022 12th International Conference on Cloud Computing, Data Science & Engineering (Confluence)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-01-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 12th International Conference on Cloud Computing, Data Science & Engineering (Confluence)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/confluence52989.2022.9734196","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 2
Abstract
Hand-sign language gestures are nonverbal messages that aid communication and can be understood visually. Since the only disability Deaf and Dumb people have relates to speech, they cannot use spoken languages, and the only way for them to communicate with people who share the same disability is through hand-sign language. This creates a language barrier: hearing people cannot understand their sign language and vice versa, and since most people are not familiar with it and interpreters are not very accessible, both sides end up at an impasse. Therefore, this paper proposes a CNN-based method for deciphering sign language and converting it to text. The proposed scheme focuses mainly on fingerspelling, with an additional emotion-recognition feature to support interpretation through the third component of sign language, i.e., non-manual features. The result is a real-time solution, built on convolutional neural networks, that makes sign language easy to interpret for hearing people as well as Deaf and Dumb people, breaking the language barrier. The experimental results show an accuracy of 99.8% on the test data, which is better than the majority of recent research papers on American Sign Language recognition.
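As a rough illustration of the kind of pipeline the abstract describes, the sketch below shows a minimal Keras CNN for classifying static fingerspelling images. The input shape, class count, and layer sizes are illustrative assumptions, not the authors' reported architecture.

```python
# Minimal sketch of a CNN fingerspelling classifier.
# Assumptions (not from the paper): 28x28 grayscale hand crops and
# 24 static ASL letter classes (J and Z are often excluded because
# they involve motion). Layer sizes are illustrative only.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 24           # assumed: static ASL letters
INPUT_SHAPE = (28, 28, 1)  # assumed: grayscale crops of the hand region

def build_model():
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        # Two convolution/pooling stages extract edge and shape features
        layers.Conv2D(32, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        layers.Conv2D(64, (3, 3), activation="relu", padding="same"),
        layers.MaxPooling2D((2, 2)),
        # Dense head maps pooled features to per-letter probabilities
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

if __name__ == "__main__":
    build_model().summary()
```

In a real-time setting such as the one the paper targets, frames from a webcam would be cropped to the hand region, resized to the model's input shape, and passed through `model.predict` to yield the letter, with the separate non-manual (emotion) channel handled by its own classifier.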