{"title":"Hand Gesture Identification and Voice Command Based Hardware Reduction","authors":"Amogh Godbole, Vishal Gondke, K. Devadkar","doi":"10.1109/PuneCon55413.2022.10014943","DOIUrl":null,"url":null,"abstract":"The use of physical devices like mouse and keyboard to communicate to our computer has hindered the natural interface between the computer and its user. To eradicate this barrier, we have designed a virtual mouse which will act upon the various hand gestures the user provides and will help them interact with the computer in a better way. The virtual mouse control of our system will keep a track of the movements of the fingers and palm, thus providing us with a specific output based on the recognized action of our hand. The proposed model will recognize both static as well as dynamic hand gestures and the results will show that the model that has been proposed and built will intuitively interact with the computer and it can be achieved the minimum usage of any hardware requirements. Our model will also have a major advantage over the current existing models, which is, it also has the feature of voice command recognition, meaning that even if the user if not able to provide appropriate gestures using their hand due to some medical condition, they will still be able to interact with the computer via voice commands.","PeriodicalId":258640,"journal":{"name":"2022 IEEE Pune Section International Conference (PuneCon)","volume":"136 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE Pune Section International Conference (PuneCon)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/PuneCon55413.2022.10014943","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
The use of physical devices such as the mouse and keyboard to communicate with a computer hinders the natural interface between the computer and its user. To remove this barrier, we have designed a virtual mouse that acts on the hand gestures the user provides and helps them interact with the computer more naturally. The virtual mouse control in our system tracks the movements of the fingers and palm, producing a specific output based on the recognized hand action. The proposed model recognizes both static and dynamic hand gestures, and the results show that the model interacts intuitively with the computer while requiring minimal hardware. Our model also has a major advantage over existing models: it includes voice command recognition, so even if a user is unable to provide appropriate hand gestures due to a medical condition, they can still interact with the computer via voice commands.
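The abstract does not name the libraries or tracking pipeline the authors used, so the following is only a minimal sketch of the kind of system described: fingertip tracking drives the cursor, a pinch gesture acts as a click, and a voice command serves as a fallback input. It assumes MediaPipe Hands for hand landmark detection, PyAutoGUI for OS-level cursor control, OpenCV for webcam capture, and the SpeechRecognition package for the voice path; the pinch threshold and the 'v' hotkey that triggers listening are illustrative choices, not details taken from the paper.

```python
import cv2                       # webcam capture and display
import mediapipe as mp           # hand landmark detection
import pyautogui                 # OS-level mouse control
import speech_recognition as sr  # voice command fallback

mp_hands = mp.solutions.hands


def listen_for_command(recognizer):
    """Capture one short utterance from the microphone and return it as lowercase text."""
    with sr.Microphone() as source:
        audio = recognizer.listen(source, phrase_time_limit=3)
    try:
        return recognizer.recognize_google(audio).lower()
    except sr.UnknownValueError:
        return ""


def run_virtual_mouse():
    screen_w, screen_h = pyautogui.size()
    cap = cv2.VideoCapture(0)
    hands = mp_hands.Hands(max_num_hands=1, min_detection_confidence=0.7)
    recognizer = sr.Recognizer()

    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)  # mirror the image so movement feels natural
        results = hands.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))

        if results.multi_hand_landmarks:
            lm = results.multi_hand_landmarks[0].landmark
            index_tip = lm[mp_hands.HandLandmark.INDEX_FINGER_TIP]
            thumb_tip = lm[mp_hands.HandLandmark.THUMB_TIP]

            # Map the normalized fingertip position to screen coordinates.
            pyautogui.moveTo(int(index_tip.x * screen_w), int(index_tip.y * screen_h))

            # Treat a thumb-index pinch as a left click (threshold is an arbitrary choice).
            if abs(index_tip.x - thumb_tip.x) < 0.04 and abs(index_tip.y - thumb_tip.y) < 0.04:
                pyautogui.click()

        cv2.imshow("virtual mouse", frame)
        key = cv2.waitKey(1) & 0xFF
        if key == ord('v'):  # press 'v' to issue a spoken command instead of a gesture
            if "click" in listen_for_command(recognizer):
                pyautogui.click()
        elif key == ord('q'):
            break

    cap.release()
    cv2.destroyAllWindows()


if __name__ == "__main__":
    run_virtual_mouse()
```

In a real system the cursor mapping would typically be smoothed and the gesture set extended (scrolling, drag, right click), and the voice recognizer would run continuously rather than behind a hotkey; this sketch only illustrates how gesture tracking and a voice fallback can replace the physical mouse hardware the abstract refers to.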