Indian Sign Language Interpretation and Sentence Formation
Disha Gangadia, Varsha Chamaria, V. Doshi, Jigyasa Gandhi
2020 IEEE Pune Section International Conference (PuneCon), 16 December 2020
DOI: 10.1109/PuneCon50868.2020.9362383
Citations: 4
Abstract
People with speech and hearing disabilities constitute approximately 1 percent of the total Indian population. Because a suitable communication medium is lacking, a person who is hearing and speech impaired cannot easily work or compete alongside others in everyday environments. Sign language, the most natural and expressive medium for the hearing and speech impaired, is used for communication among them. This paper proposes a method that recognizes sign language and converts it into text and speech for faster and clearer communication both within this community and with others. The focus is specifically on Indian Sign Language (ISL), as there is little substantial work on ISL that addresses these requirements. The paper develops a real-time system that takes video input of gestures within a specified region of interest (ROI) and performs gesture recognition using various feature extraction techniques and a Hybrid-CNN model trained on the ISL database created for this work. The correctly identified gesture tokens are passed to a Rule-Based Grammar and a web-search query module to generate candidate sentences, and a Multi-Headed BERT grammar corrector produces grammatically correct sentences as the final output.
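The abstract outlines a pipeline of ROI video capture, feature extraction, Hybrid-CNN classification, and downstream sentence generation and correction. The following is a minimal sketch of the front half of that pipeline, assuming OpenCV and TensorFlow/Keras. The ROI coordinates, input size, layer configuration, and label set are illustrative placeholders, not the paper's actual Hybrid-CNN architecture or ISL dataset.

```python
# Hypothetical sketch of ROI-based gesture token recognition.
# The paper's Hybrid-CNN, feature extraction methods, and ISL database
# are not reproduced here; all sizes and labels below are placeholders.
import cv2
import numpy as np
import tensorflow as tf

NUM_CLASSES = 26            # placeholder: one class per gesture token
ROI = (100, 100, 324, 324)  # placeholder (x1, y1, x2, y2) capture region

def build_gesture_cnn(input_shape=(64, 64, 1), num_classes=NUM_CLASSES):
    """Stand-in CNN classifier; not the paper's Hybrid-CNN."""
    return tf.keras.Sequential([
        tf.keras.layers.Conv2D(32, 3, activation="relu", input_shape=input_shape),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Conv2D(64, 3, activation="relu"),
        tf.keras.layers.MaxPooling2D(),
        tf.keras.layers.Flatten(),
        tf.keras.layers.Dense(128, activation="relu"),
        tf.keras.layers.Dense(num_classes, activation="softmax"),
    ])

def extract_features(frame):
    """Crop the ROI, convert to grayscale, and scale to the CNN input size."""
    x1, y1, x2, y2 = ROI
    roi = frame[y1:y2, x1:x2]
    gray = cv2.cvtColor(roi, cv2.COLOR_BGR2GRAY)
    resized = cv2.resize(gray, (64, 64))
    return resized.astype("float32")[..., np.newaxis] / 255.0

def recognize_tokens(video_path, model, labels):
    """Predict one gesture token per frame of the input video."""
    cap = cv2.VideoCapture(video_path)
    tokens = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        features = extract_features(frame)
        probs = model.predict(features[np.newaxis, ...], verbose=0)[0]
        tokens.append(labels[int(np.argmax(probs))])
    cap.release()
    # In the described system, these tokens would next be fed to the
    # rule-based grammar / web-search stage and the BERT grammar corrector.
    return tokens
```

The sketch assumes a trained model and a fixed rectangular ROI; the sentence-generation and BERT-correction stages are only noted in a comment, since their interfaces are not specified in the abstract.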