Mark Allen Cabutaje, Kenneth Ang Brondial, Alyssa Franchesca Obillo, Mideth B. Abisado, Shekinah Lor B. Huyo-a, G. Sampedro
Title: Ano Raw: A Deep Learning Based Approach to Transliterating the Filipino Sign Language
DOI: 10.1109/ICEIC57457.2023.10049890
Venue: 2023 International Conference on Electronics, Information, and Communication (ICEIC)
Published: 2023-02-05
Citations: 1
Abstract
Deaf people communicate primarily through sign language, one of the most vital languages worldwide. Sign languages, like spoken languages, are sophisticated, naturally evolved systems organized around conversational activity. In the Philippines, the practice of Filipino Sign Language (FSL) has already improved communication for deaf people. However, the community's main difficulty is that most Filipinos neither understand nor use FSL. Knowledge of these gestures, whether ASL or FSL, remains largely confined to the deaf community, and hearing people rarely attempt to learn sign language. This creates a significant communication gap between the deaf and the hearing. Human interpreters are needed for deaf people to be understood, but while they can be helpful, they are not always available or affordable. Filipino Sign Language alphabet recognition using a Convolutional Neural Network (CNN) is proposed to address this growing challenge. The proposed solution recognizes and predicts a letter from an image using a trained model. The model peaked at the 15th epoch, with a training accuracy of 92% and a validation accuracy of 93%. The study aims to bridge the gap between the deaf community and the hearing in the Philippines.
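The abstract does not detail the network architecture, so as a minimal illustrative sketch only (not the paper's actual model), the three core operations a CNN image classifier stacks — convolution, ReLU activation, and max pooling — can be written in plain NumPy; all function names here are hypothetical:

```python
import numpy as np

def conv2d(image, kernel):
    # "Valid" convolution (technically cross-correlation, as in CNN libraries):
    # slide the kernel over the image and take elementwise-product sums.
    kh, kw = kernel.shape
    ih, iw = image.shape
    out = np.zeros((ih - kh + 1, iw - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def relu(x):
    # Nonlinearity applied after each convolution.
    return np.maximum(x, 0.0)

def max_pool(x, size=2):
    # Downsample by keeping the maximum in each size x size window.
    h, w = x.shape
    h, w = h - h % size, w - w % size
    return x[:h, :w].reshape(h // size, size, w // size, size).max(axis=(1, 3))
```

In a real classifier, several such conv/ReLU/pool stages feed a fully connected softmax layer that outputs one probability per alphabet letter; frameworks such as TensorFlow or PyTorch implement these operations efficiently.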