Wheeled Robot Control with Hand Gesture based on Image Processing

Theodore Bismo Waskito, S. Sumaryo, C. Setianingsih

2020 IEEE International Conference on Industry 4.0, Artificial Intelligence, and Communications Technology (IAICT), July 2020
DOI: 10.1109/IAICT50021.2020.9172032
Citations: 3
Abstract
Computer vision based on shape recognition has great potential for human-computer interaction. Hand gestures can serve as symbols of human interaction with computers, much as varied hand gestures are used in sign language, and they can be assigned to tasks such as remote-control functions, robot control, and so on. Processing images of hand shapes with computer vision is called image processing. In this paper, a wheeled-robot control system moves the robot according to given hand-gesture commands. Six forms of hand gesture serve as input, and each gesture issues one movement command to the wheeled robot. The method used to classify each hand gesture is a Convolutional Neural Network (CNN), a branch of the Artificial Neural Network (ANN) that can perform feature extraction and produce the desired categories. The classification result is sent wirelessly to the robot, which executes the corresponding movement, so the wheeled robot moves in accordance with the given hand gestures. Variables that affect this system are the training parameters and the environmental parameters, which include light intensity, distance, and tilt angle. The accuracy of the entire system is 91.33%.
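The pipeline the abstract describes (CNN feature extraction over a hand image, classification into one of six gestures, then mapping the class to a robot movement command) can be sketched as follows. This is a minimal illustrative sketch, not the paper's implementation: the command names, kernel choices, and network shape are assumptions, and the convolution/pooling layers here are toy numpy versions of what a real CNN framework would provide.

```python
import numpy as np

# Hypothetical mapping from the 6 gesture classes to wheeled-robot
# commands -- the paper does not list the actual commands, so these
# names are assumptions for illustration only.
COMMANDS = ["forward", "backward", "turn_left", "turn_right", "rotate", "stop"]

def conv2d(image, kernel):
    """Valid 2-D convolution: the basic feature-extraction step of a CNN."""
    kh, kw = kernel.shape
    h, w = image.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
    return out

def softmax(logits):
    """Convert raw class scores into a probability distribution."""
    e = np.exp(logits - logits.max())
    return e / e.sum()

def classify_gesture(image, kernels, weights, bias):
    """Toy CNN forward pass: conv -> ReLU -> global average pool -> dense -> softmax.

    Returns the robot command corresponding to the most probable
    of the 6 gesture classes.
    """
    # One scalar feature per kernel: ReLU activation, then global average pooling.
    feats = np.array([np.maximum(conv2d(image, k), 0.0).mean() for k in kernels])
    probs = softmax(weights @ feats + bias)   # dense layer over pooled features
    return COMMANDS[int(np.argmax(probs))]
```

In the paper's system the weights would come from training on labelled gesture images, and the returned command string would be transmitted over the wireless link to the robot; here random weights simply demonstrate the data flow:

```python
rng = np.random.default_rng(0)
image = rng.random((28, 28))                 # stand-in for a preprocessed hand image
kernels = [np.ones((3, 3)), np.eye(3)]       # two fixed toy filters
weights = rng.random((6, 2))                 # 6 classes x 2 pooled features
command = classify_gesture(image, kernels, weights, np.zeros(6))
print(command)                               # one of the six COMMANDS
```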