{"title":"基于机器学习和Leap运动控制器的美国手语自动翻译创新方法","authors":"Jon Jenkins, S. Rashad","doi":"10.1109/UEMCON53757.2021.9666640","DOIUrl":null,"url":null,"abstract":"Millions of people globally use some form of sign language in their everyday lives. There is a need for a method of gesture recognition that is as easy to use and ubiquitous as voice recognition is today. In this paper we explore a way to translate from sign language to speech using an innovative method, utilizing the Leap Motion Controller and machine learning algorithms to capture and analyze hand movements in real time, then converting the interpreted signs into spoken word. We seek to build a system that is easy to use, intuitive to understand, adaptable to the individual, and usable in everyday life. This system will be able to work in an adaptive way to learn new signs to expand the dictionary of the system and allow higher accuracy on an individual level. It will have a wide range of applications for healthcare, education, gamification, communication, and more. An optical hand tracking piece of hardware, the Leap Motion Controller will be used to capture hand movements and information to create supervised machine learning models that can be trained to accurately guess American Sign Language (ASL) symbols being signed in real time. Experimental results show that the proposed method is promising and provides a high level of accuracy in recognizing ASL.","PeriodicalId":127072,"journal":{"name":"2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"An Innovative Method for Automatic American Sign Language Interpretation using Machine Learning and Leap Motion Controller\",\"authors\":\"Jon Jenkins, S. Rashad\",\"doi\":\"10.1109/UEMCON53757.2021.9666640\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Millions of people globally use some form of sign language in their everyday lives. There is a need for a method of gesture recognition that is as easy to use and ubiquitous as voice recognition is today. In this paper we explore a way to translate from sign language to speech using an innovative method, utilizing the Leap Motion Controller and machine learning algorithms to capture and analyze hand movements in real time, then converting the interpreted signs into spoken word. We seek to build a system that is easy to use, intuitive to understand, adaptable to the individual, and usable in everyday life. This system will be able to work in an adaptive way to learn new signs to expand the dictionary of the system and allow higher accuracy on an individual level. It will have a wide range of applications for healthcare, education, gamification, communication, and more. An optical hand tracking piece of hardware, the Leap Motion Controller will be used to capture hand movements and information to create supervised machine learning models that can be trained to accurately guess American Sign Language (ASL) symbols being signed in real time. 
Experimental results show that the proposed method is promising and provides a high level of accuracy in recognizing ASL.\",\"PeriodicalId\":127072,\"journal\":{\"name\":\"2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON)\",\"volume\":\"16 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/UEMCON53757.2021.9666640\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 IEEE 12th Annual Ubiquitous Computing, Electronics & Mobile Communication Conference (UEMCON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/UEMCON53757.2021.9666640","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
An Innovative Method for Automatic American Sign Language Interpretation using Machine Learning and Leap Motion Controller
Millions of people globally use some form of sign language in their everyday lives. There is a need for a gesture recognition method that is as easy to use and as ubiquitous as voice recognition is today. In this paper, we explore an innovative method for translating sign language into speech, using the Leap Motion Controller and machine learning algorithms to capture and analyze hand movements in real time and then convert the interpreted signs into spoken words. We seek to build a system that is easy to use, intuitive to understand, adaptable to the individual, and usable in everyday life. The system works adaptively, learning new signs to expand its dictionary and to achieve higher accuracy for individual users. It has a wide range of applications in healthcare, education, gamification, communication, and more. The Leap Motion Controller, an optical hand-tracking device, is used to capture hand movements and positional information, from which we build supervised machine learning models that can be trained to recognize American Sign Language (ASL) signs in real time. Experimental results show that the proposed method is promising and achieves a high level of accuracy in recognizing ASL.
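The abstract does not specify the feature representation or the learning algorithm, so the following Python sketch illustrates one plausible version of the described pipeline under stated assumptions: fixed-length feature vectors derived from Leap Motion frames (simulated here with random data, since the real system would log labeled frames from the device), a scikit-learn random forest as the supervised model, and pyttsx3 for the spoken output. The feature layout, the sign subset, and the `speak_sign` helper are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch of a sign-to-speech pipeline in the spirit of the paper.
# Assumptions (not from the paper): each frame is reduced to a 33-dimensional
# feature vector, the classifier is a random forest, and pyttsx3 provides
# offline text-to-speech.
import numpy as np
import pyttsx3
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical feature layout: 3D palm position (3) + five 3D fingertip
# positions (15) + five 3D fingertip directions (15) = 33 values per frame.
NUM_FEATURES = 33
SIGNS = ["A", "B", "C"]  # illustrative subset of the ASL alphabet

# Synthetic stand-in data, used only to demonstrate the pipeline; the real
# system would collect labeled Leap Motion frames per sign.
X = rng.normal(size=(300, NUM_FEATURES))
y = rng.choice(SIGNS, size=300)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0
)

# Supervised model trained on labeled sign samples, as the abstract describes.
clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
print("held-out accuracy:", clf.score(X_test, y_test))

def speak_sign(feature_vector, engine):
    """Classify one frame's features and speak the predicted sign aloud."""
    sign = clf.predict(feature_vector.reshape(1, -1))[0]
    engine.say(sign)
    engine.runAndWait()

engine = pyttsx3.init()
speak_sign(X_test[0], engine)  # classify one frame and voice the result
```

Under this framing, the adaptive dictionary expansion the abstract mentions would amount to collecting newly labeled samples for a user's sign, appending them to the training set, and refitting the classifier, which is also how per-user accuracy could improve over time.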