Title: Arabic sign language benchmark database for different heterogeneous sensors
Authors: Marco Alfonse, Amira T. Ali, A. S. Elons, N. Badr, Magdy Aboul-Ela
Venue: 2015 5th International Conference on Information & Communication Technology and Accessibility (ICTA)
DOI: 10.1109/ICTA.2015.7426902 (https://doi.org/10.1109/ICTA.2015.7426902)
Publication date: December 2015
Citations: 5
Abstract
The lack of a visualized representation for standard Arabic Sign Language (ArSL) makes it difficult to do something as commonplace as looking up an unknown word in a dictionary. The majority of printed dictionaries organize ArSL signs (represented in drawings or pictures) by their nearest Arabic translation, so unless one already knows the meaning of an Arabic sign, dictionary look-up is not a simple proposition. In order to build an Arabic sign recognition system, a standard database of Arabic signs is required to validate the results; however, such a database has been absent. In this paper, we introduce the ArSL database, a large and expanding public dataset containing video sequences of thousands of distinct ArSL signs. The database is collected using a digital camera, a Microsoft Kinect 2 sensor, and a Leap Motion tracking sensor. The dataset is created as part of a project to develop an Arabic sign language translator. The database's contents and variations qualify it as a benchmark database for ArSL for both research and commercial purposes.
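As a rough illustration of how a multi-sensor sign entry of the kind described above might be organized for downstream use, the sketch below defines a hypothetical per-sign record holding the camera, Kinect 2, and Leap Motion recordings. The field names, file names, and directory layout are assumptions for illustration only and are not taken from the paper.

```python
from dataclasses import dataclass
from pathlib import Path
from typing import Optional

# Hypothetical layout: each ArSL sign lives in its own folder containing one
# recording per sensor (camera video, Kinect 2 capture, Leap Motion trace).
# These file names and extensions are illustrative assumptions, not the
# paper's actual storage format.

@dataclass
class SignRecord:
    sign_id: str                           # identifier of the ArSL sign
    arabic_gloss: str                      # nearest Arabic translation used for look-up
    camera_video: Optional[Path] = None    # e.g. sign_0001/camera.mp4
    kinect_capture: Optional[Path] = None  # e.g. sign_0001/kinect.bin
    leap_trace: Optional[Path] = None      # e.g. sign_0001/leap.json


def load_sign(root: Path, sign_id: str, arabic_gloss: str) -> SignRecord:
    """Collect whichever sensor files exist in one sign's folder."""
    folder = root / sign_id

    def existing(name: str) -> Optional[Path]:
        candidate = folder / name
        return candidate if candidate.exists() else None

    return SignRecord(
        sign_id=sign_id,
        arabic_gloss=arabic_gloss,
        camera_video=existing("camera.mp4"),
        kinect_capture=existing("kinect.bin"),
        leap_trace=existing("leap.json"),
    )
```

A loader like this would let a recognition system train on whichever sensor streams are available for a given sign, which matches the heterogeneous-sensor framing of the dataset.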