Gesture Interface and Transfer Method for AMR by Using Recognition of Pointing Direction and Object Recognition
Authors: T. Ikeda, Naoki Noda, S. Ueki, Hironao Yamada
Journal: J. Robotics Mechatronics, Vol. 21, No. 1
Published: 2023-04-20 (Journal Article)
DOI: 10.20965/jrm.2023.p0288 (https://doi.org/10.20965/jrm.2023.p0288)
Citations: 0
Abstract
This paper describes a gesture interface for a factory transfer robot. The proposed interface uses gesture recognition to recognize the pointing direction, rather than estimating the pointed-at position as in conventional pointing-gesture estimation. Once the autonomous mobile robot (AMR) recognized the pointing direction, it performed position control based on object recognition. The AMR traveled along a path designed so that its camera could detect the object referenced for position control. The experimental results confirmed that the position and angular errors of the AMR controlled with the proposed interface were 0.058 m and 4.7°, averaged over five subjects and two conditions, which was sufficiently accurate for transportation. A questionnaire showed that the interface was more user-friendly than manual operation with a commercially available controller.