{"title":"Myo臂章与基于android的移动应用程序的集成,用于听力受损人士的交流","authors":"Malika Vachirapipop, Safra Soymat, Wasurat Tiraronnakul, Narit Hnoohom","doi":"10.1109/SITIS.2017.74","DOIUrl":null,"url":null,"abstract":"Hearing-impaired people or those with any other disabilities lack support and help. For the hearing-impaired, communication to the rest of the world is limited due to the limited number of interpreters. With our knowledge, this problem has motivated the researchers to create a medium to support these people and provide them with the ability to communicate freely in different situations. This paper includes two main actors; namely, Myo armbands and a mobile application. The Myo armbands capture muscular movements and send the captured frequency as an input to the Android-based mobile application in which then, through the embedded prediction model, the mapping of the input data occurs. Once the translation was completed, it was sent back to the mobile screen where the translation was displayed. The application was built to translate a total of six gestures, where they were then classified into three categories; namely, daily communication, illness and emergency situations. The accuracy of the application for being able to translate the gestures into the correct meaning was tested by 12 users and resulted in the accuracy of five out of six signs. The result has, however, shown some gestures to be confused with others. The gestures that were seen confused with each other was sorry and help. The reason for this confusion can be concluded in two main points. The first being too few data sets that were used for training, and second being the gestures had a close posture, i.e. the position, height and orientation of the hands. This problem was, however, able to be solved by gesture performance guidance.","PeriodicalId":153165,"journal":{"name":"2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)","volume":"27 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":"{\"title\":\"An Integration of Myo Armbands and an Android-Based Mobile Application for Communication with Hearing-Impaired Persons\",\"authors\":\"Malika Vachirapipop, Safra Soymat, Wasurat Tiraronnakul, Narit Hnoohom\",\"doi\":\"10.1109/SITIS.2017.74\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hearing-impaired people or those with any other disabilities lack support and help. For the hearing-impaired, communication to the rest of the world is limited due to the limited number of interpreters. With our knowledge, this problem has motivated the researchers to create a medium to support these people and provide them with the ability to communicate freely in different situations. This paper includes two main actors; namely, Myo armbands and a mobile application. The Myo armbands capture muscular movements and send the captured frequency as an input to the Android-based mobile application in which then, through the embedded prediction model, the mapping of the input data occurs. Once the translation was completed, it was sent back to the mobile screen where the translation was displayed. The application was built to translate a total of six gestures, where they were then classified into three categories; namely, daily communication, illness and emergency situations. 
The accuracy of the application for being able to translate the gestures into the correct meaning was tested by 12 users and resulted in the accuracy of five out of six signs. The result has, however, shown some gestures to be confused with others. The gestures that were seen confused with each other was sorry and help. The reason for this confusion can be concluded in two main points. The first being too few data sets that were used for training, and second being the gestures had a close posture, i.e. the position, height and orientation of the hands. This problem was, however, able to be solved by gesture performance guidance.\",\"PeriodicalId\":153165,\"journal\":{\"name\":\"2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)\",\"volume\":\"27 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"4\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SITIS.2017.74\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 13th International Conference on Signal-Image Technology & Internet-Based Systems (SITIS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SITIS.2017.74","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Hearing-impaired people, like those with other disabilities, often lack adequate support. For the hearing-impaired in particular, communication with the rest of the world is restricted by the small number of available interpreters. This problem motivated the researchers to create a medium that supports these users and gives them the ability to communicate freely in different situations. The system presented in this paper has two main components: Myo armbands and a mobile application. The Myo armbands capture muscular movements and send the captured signals as input to the Android-based mobile application, where an embedded prediction model maps the input data to a gesture. Once the translation is complete, it is displayed on the mobile screen. The application was built to translate a total of six gestures, classified into three categories: daily communication, illness, and emergency situations. The application's ability to translate the gestures into the correct meaning was tested with 12 users, and five of the six signs were recognized accurately. The results, however, showed that some gestures were confused with others; in particular, 'sorry' and 'help' were mistaken for each other. This confusion can be attributed to two main causes: first, too few data sets were used for training, and second, the two gestures have similar postures, i.e. the position, height and orientation of the hands. The problem could, however, be resolved by guiding users on how to perform the gestures.
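The abstract does not specify what form the embedded prediction model takes. As a purely illustrative sketch, the snippet below uses a nearest-centroid classifier over a mean-amplitude feature vector computed from the Myo armband's eight EMG channels; the gesture labels, centroid values, feature extraction, and all class and method names are hypothetical and are not taken from the paper.

```java
import java.util.Arrays;

/**
 * Illustrative sketch only: a nearest-centroid classifier mapping an
 * 8-channel EMG feature vector to one of six gesture labels. All labels,
 * centroids, and names are hypothetical, not the paper's actual model.
 */
public class GestureClassifierSketch {

    // Hypothetical six-sign vocabulary spanning the paper's three categories
    // (daily communication, illness, emergency situations).
    private static final String[] GESTURES = {
            "hello", "thank you", "sorry", "pain", "help", "emergency"
    };

    private static final int CHANNELS = 8;

    // One centroid per gesture; in practice these would be learned from
    // labelled EMG recordings collected during training sessions.
    private final double[][] centroids;

    public GestureClassifierSketch(double[][] centroids) {
        this.centroids = centroids;
    }

    /** Reduce a window of raw 8-channel EMG samples to per-channel mean absolute amplitude. */
    public static double[] extractFeatures(int[][] emgWindow) {
        double[] features = new double[CHANNELS];
        for (int[] sample : emgWindow) {
            for (int ch = 0; ch < CHANNELS; ch++) {
                features[ch] += Math.abs(sample[ch]);
            }
        }
        for (int ch = 0; ch < CHANNELS; ch++) {
            features[ch] /= emgWindow.length;
        }
        return features;
    }

    /** Return the gesture whose centroid is closest (squared Euclidean distance) to the features. */
    public String classify(double[] features) {
        int best = 0;
        double bestDist = Double.MAX_VALUE;
        for (int i = 0; i < centroids.length; i++) {
            double dist = 0;
            for (int ch = 0; ch < CHANNELS; ch++) {
                double d = features[ch] - centroids[i][ch];
                dist += d * d;
            }
            if (dist < bestDist) {
                bestDist = dist;
                best = i;
            }
        }
        return GESTURES[best];
    }

    public static void main(String[] args) {
        // Toy centroids and a toy EMG window, purely to show the call flow.
        double[][] toyCentroids = new double[GESTURES.length][CHANNELS];
        for (int i = 0; i < toyCentroids.length; i++) {
            Arrays.fill(toyCentroids[i], 10.0 * (i + 1));
        }
        GestureClassifierSketch classifier = new GestureClassifierSketch(toyCentroids);

        int[][] emgWindow = new int[50][CHANNELS];
        for (int[] sample : emgWindow) {
            Arrays.fill(sample, 30); // mean amplitude nearest the third centroid
        }
        System.out.println(classifier.classify(GestureClassifierSketch.extractFeatures(emgWindow)));
    }
}
```

In the actual system described by the abstract, the feature window would be filled from the armband's streaming data on the Android device, and the classifier's output would be rendered on the phone screen as the displayed translation.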