Dina A. Alabbad, Nouha O. Alsaleh, Naimah A. Alaqeel, Yara A. Alshehri, Nashwa A. Alzahrani, Maha K. Alhobaishi
{"title":"基于机器人的阿拉伯手语翻译系统","authors":"Dina A. Alabbad, Nouha O. Alsaleh, Naimah A. Alaqeel, Yara A. Alshehri, Nashwa A. Alzahrani, Maha K. Alhobaishi","doi":"10.1109/CDMA54072.2022.00030","DOIUrl":null,"url":null,"abstract":"Services provided to deaf people in the Eastern province of Saudi Arabia were evaluated, which confirmed a high need to support the deaf community. This paper proposes utilizing the Pepper robot in the task of recognizing and translating Arabic sign language (ArSL), by which the robot recognizes static hand gestures of the letters in ArSL from each keyframe extracted from the input video and translate it into written text and vice versa. This project aims to conduct a two-way translation of the Arabic sign language in a way that fulfills the communication gap found in Saudi Arabia among deaf and non-deaf people. The methods proposed in this paper are computer vision to use the pepper robot's camera and sensors, Natural language processing to convert natural speech to sign language and Deep learning to build a convolutional neural network model that classifies the sign language gestures and convert them into their corresponding written and spoken form. Moreover, two datasets were used, first one is a collection of hand gestures for training the model and the other one is 39 animated signs of all the Arabic letters and special letters.","PeriodicalId":313042,"journal":{"name":"2022 7th International Conference on Data Science and Machine Learning Applications (CDMA)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"A Robot-based Arabic Sign Language Translating System\",\"authors\":\"Dina A. Alabbad, Nouha O. Alsaleh, Naimah A. Alaqeel, Yara A. Alshehri, Nashwa A. Alzahrani, Maha K. 
Alhobaishi\",\"doi\":\"10.1109/CDMA54072.2022.00030\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Services provided to deaf people in the Eastern province of Saudi Arabia were evaluated, which confirmed a high need to support the deaf community. This paper proposes utilizing the Pepper robot in the task of recognizing and translating Arabic sign language (ArSL), by which the robot recognizes static hand gestures of the letters in ArSL from each keyframe extracted from the input video and translate it into written text and vice versa. This project aims to conduct a two-way translation of the Arabic sign language in a way that fulfills the communication gap found in Saudi Arabia among deaf and non-deaf people. The methods proposed in this paper are computer vision to use the pepper robot's camera and sensors, Natural language processing to convert natural speech to sign language and Deep learning to build a convolutional neural network model that classifies the sign language gestures and convert them into their corresponding written and spoken form. 
Moreover, two datasets were used, first one is a collection of hand gestures for training the model and the other one is 39 animated signs of all the Arabic letters and special letters.\",\"PeriodicalId\":313042,\"journal\":{\"name\":\"2022 7th International Conference on Data Science and Machine Learning Applications (CDMA)\",\"volume\":\"38 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-03-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 7th International Conference on Data Science and Machine Learning Applications (CDMA)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CDMA54072.2022.00030\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 7th International Conference on Data Science and Machine Learning Applications (CDMA)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CDMA54072.2022.00030","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A Robot-based Arabic Sign Language Translating System
Services provided to deaf people in the Eastern Province of Saudi Arabia were evaluated, confirming a strong need to support the deaf community. This paper proposes using the Pepper robot to recognize and translate Arabic Sign Language (ArSL): the robot recognizes the static hand gestures of ArSL letters in keyframes extracted from the input video and translates them into written text, and vice versa. The project aims to provide two-way translation of ArSL and thereby bridge the communication gap between deaf and hearing people in Saudi Arabia. The proposed methods combine computer vision, using the Pepper robot's camera and sensors; natural language processing, to convert natural speech into sign language; and deep learning, to build a convolutional neural network (CNN) model that classifies sign-language gestures and converts them into their corresponding written and spoken forms. Two datasets were used: a collection of hand-gesture images for training the model, and 39 animated signs covering all Arabic letters and special letters.
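The abstract does not specify how keyframes are selected from the input video before classification. A common baseline, sketched below under that assumption, is frame differencing: a frame is kept as a keyframe only when it differs sufficiently from the last kept frame, so near-duplicate frames of the same held gesture are skipped. The function name, threshold value, and synthetic "video" here are illustrative, not from the paper.

```python
import numpy as np

def extract_keyframes(frames, threshold=0.1):
    """Select keyframe indices by frame differencing.

    `frames` is an array of shape (num_frames, H, W) with pixel values
    scaled to [0, 1]. A frame is kept when its mean absolute pixel
    difference from the previously kept frame exceeds `threshold`.
    The first frame is always kept.
    """
    keyframes = [0]
    last = frames[0]
    for i in range(1, len(frames)):
        diff = np.mean(np.abs(frames[i] - last))
        if diff > threshold:
            keyframes.append(i)
            last = frames[i]
    return keyframes

# Synthetic 8-frame "video": frames 0-3 show one static gesture,
# frames 4-7 an abruptly different one (mimicking a new letter sign).
video = np.zeros((8, 32, 32), dtype=np.float32)
video[4:] = 0.5
print(extract_keyframes(video))  # → [0, 4]
```

Each selected keyframe would then be passed to the CNN gesture classifier; in a real pipeline the threshold would be tuned so that one keyframe is retained per held letter sign.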