SmartCall: A Real-time, Sign Language Medical Emergency Communicator
Mustapha Deji Dere, Roshidat Oluwabukola Dere, Adewale Adesina, A. Yauri
2022 5th Information Technology for Education and Development (ITED), November 2022
DOI: 10.1109/ITED56637.2022.10051420
Citations: 1
Abstract
Communication is essential for individuals to convey feelings and emotions. Persons with speech impairment, however, find it challenging to share their thoughts, especially during medical emergencies. In this study, we propose a low-cost embedded device that allows individuals with a speech impairment to communicate during medical emergencies. A 1D convolutional neural network (CNN) model extracts features from an onboard inertial measurement unit (IMU) to classify selected American Sign Language (ASL) medical emergency words. The model was trained offline before deployment to a resource-constrained embedded device for real-time ASL word classification. A pilot test with two volunteers yielded an offline accuracy of 91.2% and an average online accuracy of 92% for the 8-bit optimized model. These results demonstrate the feasibility of helping individuals with a speech impairment communicate during medical emergencies. Furthermore, an extended application of the proposed design is the intuitive learning of sign languages using artificial intelligence.
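The abstract outlines a pipeline of offline training of a 1D CNN on IMU windows followed by 8-bit optimization for an embedded target. The sketch below illustrates what such a pipeline could look like in TensorFlow/Keras with post-training full-integer quantization; the window length, channel count, layer sizes, number of gesture classes, and calibration data are illustrative assumptions, not details taken from the paper.

```python
# Hypothetical sketch, assuming a Keras 1D-CNN over raw IMU windows and
# TensorFlow Lite full-integer (8-bit) post-training quantization.
# All sizes below are assumptions for illustration only.
import numpy as np
import tensorflow as tf

WINDOW_LEN = 128      # assumed IMU samples per gesture window
NUM_CHANNELS = 6      # assumed 3-axis accelerometer + 3-axis gyroscope
NUM_CLASSES = 8       # assumed number of ASL medical-emergency words

def build_model() -> tf.keras.Model:
    """Small 1D-CNN classifier over raw IMU windows."""
    return tf.keras.Sequential([
        tf.keras.layers.Input(shape=(WINDOW_LEN, NUM_CHANNELS)),
        tf.keras.layers.Conv1D(16, kernel_size=5, activation="relu"),
        tf.keras.layers.MaxPooling1D(2),
        tf.keras.layers.Conv1D(32, kernel_size=5, activation="relu"),
        tf.keras.layers.GlobalAveragePooling1D(),
        tf.keras.layers.Dense(32, activation="relu"),
        tf.keras.layers.Dense(NUM_CLASSES, activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])
# Offline training step (data loading omitted):
# model.fit(train_windows, train_labels, epochs=..., validation_data=...)

def representative_dataset():
    # Placeholder calibration data; real use would feed recorded IMU windows.
    for _ in range(100):
        yield [np.random.randn(1, WINDOW_LEN, NUM_CHANNELS).astype(np.float32)]

# Post-training 8-bit quantization for a resource-constrained device.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
converter.optimizations = [tf.lite.Optimize.DEFAULT]
converter.representative_dataset = representative_dataset
converter.target_spec.supported_ops = [tf.lite.OpsSet.TFLITE_BUILTINS_INT8]
converter.inference_input_type = tf.int8
converter.inference_output_type = tf.int8
tflite_model = converter.convert()

with open("smartcall_int8.tflite", "wb") as f:
    f.write(tflite_model)
```

The resulting flatbuffer could then be run on a microcontroller-class board (for example via TensorFlow Lite Micro) for real-time word classification; whether the paper uses this toolchain or another 8-bit optimization route is not stated in the abstract.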