{"title":"基于卷积神经网络的复杂背景下仿人机器人手势识别","authors":"Ali Yildiz, N. G. Adar, A. Mert","doi":"10.34028/iajit/20/3/9","DOIUrl":null,"url":null,"abstract":"Hand gesture recognition is a preferred way for human-robot interactions. Conventional approaches are generally based on image processing and recognition of hand poses with simple backgrounds. In this paper, we propose deep learning models, and humanoid robot integration for offline and online (real-time) recognition and control using hand gestures. One thousand and two hundred of hand images belonging to four participants are collected to construct the hand gesture database. Five class (forward, backward, right, left and stop) images in six sophisticated backgrounds with different illumination levels are obtained for four participants, and then one participant's images are kept as testing data. A lightweight Convolutional Neural Network (CNN), and transfer learning techniques using VGG16, and Mobilenetv2 are performed on this database to evaluate user independent performance of the hand gesture system. After offline training, real-time implementation is designed using a mobile phone (Wi-Fi and camera), Wi-Fi router, computer with embedded deep learning algorithms, and NAO humanoid robot. Streamed video by the mobile phone is processed and recognized using the proposed deep algorithm in the computer, and then command is transferred to robot via TCP/IP protocol. Thus, the NAO humanoid robot control using hand gesture in RGB and HSV color spaces is evaluated in sophisticated background, and the implementation of the system is presented. In our simulations, 95% and 100% accuracy rates are yielded for the lightweight CNN, and transfer learning, respectively.","PeriodicalId":13624,"journal":{"name":"Int. Arab J. Inf. 
Technol.","volume":"54 1","pages":"368-375"},"PeriodicalIF":0.0000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Convolutional neural network based hand gesture recognition in sophisticated background for humanoid robot control\",\"authors\":\"Ali Yildiz, N. G. Adar, A. Mert\",\"doi\":\"10.34028/iajit/20/3/9\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Hand gesture recognition is a preferred way for human-robot interactions. Conventional approaches are generally based on image processing and recognition of hand poses with simple backgrounds. In this paper, we propose deep learning models, and humanoid robot integration for offline and online (real-time) recognition and control using hand gestures. One thousand and two hundred of hand images belonging to four participants are collected to construct the hand gesture database. Five class (forward, backward, right, left and stop) images in six sophisticated backgrounds with different illumination levels are obtained for four participants, and then one participant's images are kept as testing data. A lightweight Convolutional Neural Network (CNN), and transfer learning techniques using VGG16, and Mobilenetv2 are performed on this database to evaluate user independent performance of the hand gesture system. After offline training, real-time implementation is designed using a mobile phone (Wi-Fi and camera), Wi-Fi router, computer with embedded deep learning algorithms, and NAO humanoid robot. Streamed video by the mobile phone is processed and recognized using the proposed deep algorithm in the computer, and then command is transferred to robot via TCP/IP protocol. Thus, the NAO humanoid robot control using hand gesture in RGB and HSV color spaces is evaluated in sophisticated background, and the implementation of the system is presented. 
In our simulations, 95% and 100% accuracy rates are yielded for the lightweight CNN, and transfer learning, respectively.\",\"PeriodicalId\":13624,\"journal\":{\"name\":\"Int. Arab J. Inf. Technol.\",\"volume\":\"54 1\",\"pages\":\"368-375\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2023-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Int. Arab J. Inf. Technol.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.34028/iajit/20/3/9\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. Arab J. Inf. Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.34028/iajit/20/3/9","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Convolutional neural network based hand gesture recognition in sophisticated background for humanoid robot control
Hand gesture recognition is a preferred modality for human-robot interaction. Conventional approaches are generally based on image processing and recognize hand poses against simple backgrounds. In this paper, we propose deep learning models and their integration with a humanoid robot for offline and online (real-time) gesture recognition and control. One thousand two hundred hand images from four participants are collected to construct the hand gesture database. Images of five classes (forward, backward, right, left, and stop) are captured against six sophisticated backgrounds under different illumination levels, and one participant's images are held out as testing data. A lightweight Convolutional Neural Network (CNN) and transfer learning with VGG16 and MobileNetV2 are trained on this database to evaluate the user-independent performance of the hand gesture system. After offline training, a real-time implementation is designed using a mobile phone (Wi-Fi and camera), a Wi-Fi router, a computer running the deep learning algorithms, and a NAO humanoid robot. Video streamed by the mobile phone is processed and recognized by the proposed deep model on the computer, and the resulting command is transferred to the robot via the TCP/IP protocol. Thus, NAO humanoid robot control using hand gestures in the RGB and HSV color spaces is evaluated against sophisticated backgrounds, and the implementation of the system is presented. In our simulations, accuracy rates of 95% and 100% are achieved for the lightweight CNN and transfer learning, respectively.
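The recognition-to-command link described above (computer classifies a frame, then forwards the command to the robot over TCP/IP) can be sketched as a minimal loopback demonstration. The command strings, port handling, and wire format below are assumptions for illustration; the paper does not specify the actual protocol used with the NAO robot.

```python
# Sketch of the gesture-to-command link: the computer maps the CNN's predicted
# class index to one of the five commands and sends it to the robot via TCP/IP.
# A local thread stands in for the robot's listener; a real deployment would
# connect to the NAO's IP address instead of the loopback interface.
import socket
import threading

COMMANDS = ["forward", "backward", "right", "left", "stop"]  # five gesture classes

def robot_listener(server_sock, received):
    """Stand-in for the robot side: accept one connection, read one command."""
    conn, _ = server_sock.accept()
    with conn:
        received.append(conn.recv(64).decode("utf-8"))

def send_command(class_index, host, port):
    """Computer side: translate the predicted class index and send the command."""
    cmd = COMMANDS[class_index]
    with socket.create_connection((host, port)) as sock:
        sock.sendall(cmd.encode("utf-8"))
    return cmd

# Loopback demonstration on an ephemeral port.
server = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
server.bind(("127.0.0.1", 0))
server.listen(1)
received = []
t = threading.Thread(target=robot_listener, args=(server, received))
t.start()
sent = send_command(0, "127.0.0.1", server.getsockname()[1])
t.join()
server.close()
print(sent, received[0])
```

In the deployed system this send would sit inside the real-time loop: each streamed frame is classified, and only the resulting label (not the image) crosses the network to the robot, which keeps the per-frame payload tiny.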