K.K.T. Punsara, H. Premachandra, A.W.A.D. Chanaka, R.V. Wijayawickrama, Nimsiri Abhayasinghe, Rajitha de Silva
{"title":"基于物联网的手语识别系统","authors":"K.K.T Punsara, H. Premachandra, A.W.A.D Chanaka, R.V Wijayawickrama, Abhayasinghe Nimsiri, Silva Rajitha de","doi":"10.1109/ICAC51239.2020.9357267","DOIUrl":null,"url":null,"abstract":"Sign language is the key communication medium, which deaf and mute people use in their day - to-day life. Talking to disabled people will cause a difficult situation since a non-mute person cannot understand their hand gestures and in many instances, mute people are hearing impaired. Same as Sinhala, Tamil, English, or any other language, sign language also tend to have differences according to the region. This paper is an attempt to assist deaf and mute people to develop an effective communication mechanism with non-mute people. The end product of this project is a combination of a mobile application that can translate the sign language into digital voice and loT-enabled, light-weighted wearable glove, which capable of recognizing twenty-six English alphabet, digits, and words. Better user experience provides with voice-to-text feature in mobile application to reduce the communication gap within mute and non-mute communities. Research findings and results from the current system visualize the output of the product can be optimized up to 25 % –35 % with an enhanced pattern recognition mechanism.","PeriodicalId":253040,"journal":{"name":"2020 2nd International Conference on Advancements in Computing (ICAC)","volume":"9 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-12-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"IoT Based Sign Language Recognition System\",\"authors\":\"K.K.T Punsara, H. Premachandra, A.W.A.D Chanaka, R.V Wijayawickrama, Abhayasinghe Nimsiri, Silva Rajitha de\",\"doi\":\"10.1109/ICAC51239.2020.9357267\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sign language is the key communication medium, which deaf and mute people use in their day - to-day life. Talking to disabled people will cause a difficult situation since a non-mute person cannot understand their hand gestures and in many instances, mute people are hearing impaired. Same as Sinhala, Tamil, English, or any other language, sign language also tend to have differences according to the region. This paper is an attempt to assist deaf and mute people to develop an effective communication mechanism with non-mute people. The end product of this project is a combination of a mobile application that can translate the sign language into digital voice and loT-enabled, light-weighted wearable glove, which capable of recognizing twenty-six English alphabet, digits, and words. Better user experience provides with voice-to-text feature in mobile application to reduce the communication gap within mute and non-mute communities. 
Research findings and results from the current system visualize the output of the product can be optimized up to 25 % –35 % with an enhanced pattern recognition mechanism.\",\"PeriodicalId\":253040,\"journal\":{\"name\":\"2020 2nd International Conference on Advancements in Computing (ICAC)\",\"volume\":\"9 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-12-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2020 2nd International Conference on Advancements in Computing (ICAC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICAC51239.2020.9357267\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 2nd International Conference on Advancements in Computing (ICAC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICAC51239.2020.9357267","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sign language is the key communication medium that deaf and mute people use in their day-to-day life. Conversations with them can be difficult because a non-mute person cannot understand their hand gestures, and in many instances mute people are also hearing impaired. Like Sinhala, Tamil, English, or any other language, sign language tends to differ by region. This paper is an attempt to help deaf and mute people develop an effective communication mechanism with non-mute people. The end product of this project combines a mobile application that translates sign language into digital voice with an IoT-enabled, lightweight wearable glove capable of recognizing the twenty-six letters of the English alphabet, digits, and words. A voice-to-text feature in the mobile application provides a better user experience and reduces the communication gap between the mute and non-mute communities. Research findings and results from the current system indicate that the output of the product can be improved by up to 25%–35% with an enhanced pattern recognition mechanism.
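The abstract does not detail how the glove maps sensor readings to letters. The following is a minimal sketch, assuming five normalized flex-sensor readings per gesture and a nearest-template comparison against pre-calibrated letter templates; the sensor count, the template values, and the classify_gesture helper are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of glove-based letter recognition (illustrative only).
# Assumes five normalized flex-sensor readings per gesture and a set of
# pre-calibrated templates; the paper's actual recognition mechanism,
# sensor count, and feature set are not specified in the abstract.
import math

# Hypothetical calibration templates: letter -> expected readings
# (0.0 = finger straight, 1.0 = finger fully bent).
TEMPLATES = {
    "A": [0.9, 0.9, 0.9, 0.9, 0.1],   # fist with thumb out
    "B": [0.1, 0.1, 0.1, 0.1, 0.8],   # fingers straight, thumb folded
    "L": [0.1, 0.9, 0.9, 0.9, 0.1],   # index and thumb extended
}

def classify_gesture(readings, threshold=0.5):
    """Return the closest template letter, or None if no template is near enough."""
    best_letter, best_dist = None, float("inf")
    for letter, template in TEMPLATES.items():
        dist = math.dist(readings, template)  # Euclidean distance in sensor space
        if dist < best_dist:
            best_letter, best_dist = letter, dist
    return best_letter if best_dist <= threshold else None

if __name__ == "__main__":
    sample = [0.12, 0.88, 0.92, 0.85, 0.15]  # simulated glove frame
    print(classify_gesture(sample))          # -> "L"
```

In the system described by the abstract, the recognized token would be forwarded to the mobile application for text-to-speech output; here it is simply printed.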