{"title":"使用深度学习解释聋哑人手写拼写银行帮助台术语","authors":"Aditi Chavan, Jayshree Ghorpade-aher, Aakriti Bhat, Aniket Raj, Shubham Mishra","doi":"10.1109/punecon52575.2021.9686514","DOIUrl":null,"url":null,"abstract":"The hand sign language uses visual-manual modality to share a certain message. Specially abled people having hearing and speaking disabilities interact more naturally in hand sign language rather than verbal language. According to one of the Census study, 2.21% out of 121 crore population in India are ‘disabled’, out of which 19% are having a hearing disability and 7% are having speech disability. Since everyone cannot communicate in sign language as it is a lesser-known language, it often leads to communication gap. So, the automated Sign Language Interpreter (SLI) helps to meet this communication gap as a manual sign language translator is not a convenient option because of its privacy issues and lack of availability. This paper proposes an Indian Hand Sign Language Interpreter which operates upon a vision-based approach that uses Machine Learning and Deep Learning techniques to locate the hand gesture region accurately for extracting the features and finally interpreting the respective meaning. The experimentation for performance metrics such as accuracy and loss using various activation functions helped to analyzed the performance of the model. The system successfully identifies a number of hand spelled words and thus eases the communication among people.","PeriodicalId":154406,"journal":{"name":"2021 IEEE Pune Section International Conference (PuneCon)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Interpretation of Hand Spelled Banking Helpdesk Terms for Deaf and Dumb Using Deep Learning\",\"authors\":\"Aditi Chavan, Jayshree Ghorpade-aher, Aakriti Bhat, Aniket Raj, Shubham Mishra\",\"doi\":\"10.1109/punecon52575.2021.9686514\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The hand sign language uses visual-manual modality to share a certain message. Specially abled people having hearing and speaking disabilities interact more naturally in hand sign language rather than verbal language. According to one of the Census study, 2.21% out of 121 crore population in India are ‘disabled’, out of which 19% are having a hearing disability and 7% are having speech disability. Since everyone cannot communicate in sign language as it is a lesser-known language, it often leads to communication gap. So, the automated Sign Language Interpreter (SLI) helps to meet this communication gap as a manual sign language translator is not a convenient option because of its privacy issues and lack of availability. This paper proposes an Indian Hand Sign Language Interpreter which operates upon a vision-based approach that uses Machine Learning and Deep Learning techniques to locate the hand gesture region accurately for extracting the features and finally interpreting the respective meaning. The experimentation for performance metrics such as accuracy and loss using various activation functions helped to analyzed the performance of the model. 
The system successfully identifies a number of hand spelled words and thus eases the communication among people.\",\"PeriodicalId\":154406,\"journal\":{\"name\":\"2021 IEEE Pune Section International Conference (PuneCon)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE Pune Section International Conference (PuneCon)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/punecon52575.2021.9686514\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE Pune Section International Conference (PuneCon)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/punecon52575.2021.9686514","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: Hand sign language uses the visual-manual modality to convey a message. Specially abled people with hearing and speech disabilities interact more naturally in sign language than in verbal language. According to a Census study, 2.21% of India's population of 121 crore is 'disabled'; of these, 19% have a hearing disability and 7% have a speech disability. Because sign language is a lesser-known language that not everyone can use, a communication gap often arises. An automated Sign Language Interpreter (SLI) helps bridge this gap, since a human sign language translator is often not a convenient option due to privacy concerns and limited availability. This paper proposes an Indian Hand Sign Language Interpreter that follows a vision-based approach, using Machine Learning and Deep Learning techniques to accurately locate the hand gesture region, extract its features, and finally interpret the corresponding meaning. Experiments on performance metrics such as accuracy and loss with various activation functions helped analyze the performance of the model. The system successfully identifies a number of hand-spelled words and thus eases communication among people.
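The abstract describes a vision-based pipeline that first locates the hand gesture region before feature extraction, but it does not spell out the localization method. The snippet below is a minimal sketch of one common approach (an HSV skin-colour mask followed by a largest-contour bounding box in OpenCV), offered only as an illustration under those assumptions, not as the authors' implementation.

```python
import cv2
import numpy as np

def locate_hand_region(frame_bgr):
    """Crop the most likely hand region from a BGR frame using a skin-colour mask.

    Returns the cropped region, or None if no candidate contour is found.
    """
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    # Rough skin-tone range in HSV; the thresholds are illustrative and would
    # need tuning for real lighting conditions and skin tones.
    lower = np.array([0, 30, 60], dtype=np.uint8)
    upper = np.array([20, 150, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Remove small speckles before looking for the hand contour.
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    # Assume the largest skin-coloured blob is the signing hand.
    x, y, w, h = cv2.boundingRect(max(contours, key=cv2.contourArea))
    return frame_bgr[y:y + h, x:x + w]
```

The abstract also reports comparing accuracy and loss across activation functions. The sketch below assumes a small Keras CNN classifier over cropped hand images; the architecture, input size, class count, and the synthetic stand-in data are illustrative assumptions rather than details taken from the paper.

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 26           # assumption: one class per hand-spelled letter
INPUT_SHAPE = (64, 64, 1)  # assumption: 64x64 grayscale crops of the hand region

def build_model(activation):
    """Small CNN whose hidden-layer activation can be swapped for comparison."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(32, 3, activation=activation),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation=activation),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation=activation),
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

# Synthetic stand-in data; a real experiment would use a labelled
# Indian Sign Language fingerspelling image dataset instead.
x_train = np.random.rand(200, 64, 64, 1).astype("float32")
y_train = np.random.randint(0, NUM_CLASSES, 200)
x_val = np.random.rand(50, 64, 64, 1).astype("float32")
y_val = np.random.randint(0, NUM_CLASSES, 50)

# Train the same architecture with different activations and compare
# validation accuracy and loss, in the spirit of the reported experiments.
for act in ("relu", "tanh", "sigmoid"):
    model = build_model(act)
    history = model.fit(x_train, y_train, epochs=5,
                        validation_data=(x_val, y_val), verbose=0)
    print(act,
          "val_accuracy=%.3f" % history.history["val_accuracy"][-1],
          "val_loss=%.3f" % history.history["val_loss"][-1])
```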