An Integration of Myo Armbands and an Android-Based Mobile Application for Communication with Hearing-Impaired Persons

Malika Vachirapipop, Safra Soymat, Wasurat Tiraronnakul, Narit Hnoohom
Published in: 2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)
DOI: 10.1109/SITIS.2017.74
Publication date: December 2017
Cited by: 4

Abstract

Hearing-impaired people, like those with other disabilities, often lack support. For the hearing-impaired, communication with the rest of the world is restricted by the limited number of interpreters. This problem motivated the researchers to create a medium that supports these people and gives them the ability to communicate freely in different situations. The system presented in this paper comprises two main components: Myo armbands and a mobile application. The Myo armbands capture muscular movements and send the captured signals as input to the Android-based mobile application, where an embedded prediction model maps the input data to a gesture. Once the translation is completed, it is displayed on the mobile screen. The application was built to translate a total of six gestures, classified into three categories: daily communication, illness, and emergency situations. The accuracy of the application in translating gestures into the correct meaning was tested with 12 users, and five of the six signs were translated correctly. The results, however, showed that some gestures were confused with others; the gestures confused with each other were "sorry" and "help". This confusion can be attributed to two main points: first, too few data sets were used for training, and second, the two gestures have similar postures, i.e. the position, height, and orientation of the hands. This problem was, however, solved by guidance on gesture performance.
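The abstract describes a simple pipeline: the armbands stream muscle-activity signals to the app, and an embedded prediction model maps each incoming feature vector to one of six gestures. A minimal sketch of such a mapping, using a toy nearest-centroid classifier, is shown below. The gesture labels other than "sorry" and "help", the 8-channel feature layout (the Myo armband has eight surface-EMG sensors), and all numeric values are illustrative assumptions, not the paper's actual model.

```python
import math

# Six gesture labels grouped into the paper's three categories.
# Only "sorry" and "help" are named in the abstract; the other four
# labels and the category assignments are illustrative placeholders.
GESTURES = {
    "hello": "daily communication",
    "thank you": "daily communication",
    "sorry": "daily communication",
    "sick": "illness",
    "pain": "illness",
    "help": "emergency",
}

# Toy per-gesture centroids over 8 EMG channels. All values are
# fabricated for illustration; "sorry" and "help" are deliberately
# close, mirroring the confusion reported in the abstract.
CENTROIDS = {
    "hello":     [0.9, 0.1, 0.0, 0.2, 0.1, 0.0, 0.3, 0.1],
    "thank you": [0.1, 0.8, 0.2, 0.0, 0.3, 0.1, 0.0, 0.2],
    "sick":      [0.0, 0.2, 0.9, 0.1, 0.0, 0.3, 0.1, 0.0],
    "pain":      [0.2, 0.0, 0.1, 0.9, 0.2, 0.0, 0.0, 0.3],
    "sorry":     [0.1, 0.3, 0.0, 0.1, 0.8, 0.2, 0.1, 0.0],
    "help":      [0.1, 0.3, 0.0, 0.2, 0.7, 0.2, 0.1, 0.1],
}

def classify(sample):
    """Return the gesture whose centroid is nearest (Euclidean
    distance) to the incoming 8-channel EMG feature vector."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(CENTROIDS, key=lambda g: dist(CENTROIDS[g], sample))

print(classify([0.9, 0.1, 0.0, 0.2, 0.1, 0.0, 0.3, 0.1]))  # hello
```

Because the "sorry" and "help" centroids sit close together in feature space, small variations in hand position or orientation push a sample across the decision boundary — the same failure mode the abstract attributes to similar postures and too little training data.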