{"title":"Augmented reality-assisted gesture-based teleoperated system for robot motion planning","authors":"Ahmed Eslam Salman, M. Roman","doi":"10.1108/ir-11-2022-0289","DOIUrl":null,"url":null,"abstract":"\nPurpose\nThe study proposed a human–robot interaction (HRI) framework to enable operators to communicate remotely with robots in a simple and intuitive way. The study focused on the situation when operators with no programming skills have to accomplish teleoperated tasks dealing with randomly localized different-sized objects in an unstructured environment. The purpose of this study is to reduce stress on operators, increase accuracy and reduce the time of task accomplishment. The special application of the proposed system is in the radioactive isotope production factories. The following approach combined the reactivity of the operator’s direct control with the powerful tools of vision-based object classification and localization.\n\n\nDesign/methodology/approach\nPerceptive real-time gesture control predicated on a Kinect sensor is formulated by information fusion between human intuitiveness and an augmented reality-based vision algorithm. Objects are localized using a developed feature-based vision algorithm, where the homography is estimated and Perspective-n-Point problem is solved. The 3D object position and orientation are stored in the robot end-effector memory for the last mission adjusting and waiting for a gesture control signal to autonomously pick/place an object. Object classification process is done using a one-shot Siamese neural network (NN) to train a proposed deep NN; other well-known models are also used in a comparison. The system was contextualized in one of the nuclear industry applications: radioactive isotope production and its validation were performed through a user study where 10 participants of different backgrounds are involved.\n\n\nFindings\nThe system was contextualized in one of the nuclear industry applications: radioactive isotope production and its validation were performed through a user study where 10 participants of different backgrounds are involved. The results revealed the effectiveness of the proposed teleoperation system and demonstrate its potential for use by robotics non-experienced users to effectively accomplish remote robot tasks.\n\n\nSocial implications\nThe proposed system reduces risk and increases level of safety when applied in hazardous environment such as the nuclear one.\n\n\nOriginality/value\nThe contribution and uniqueness of the presented study are represented in the development of a well-integrated HRI system that can tackle the four aforementioned circumstances in an effective and user-friendly way. High operator–robot reactivity is kept by using the direct control method, while a lot of cognitive stress is removed using elective/flapped autonomous mode to manipulate randomly localized different configuration objects. 
This necessitates building an effective deep learning algorithm (in comparison to well-known methods) to recognize objects in different conditions: illumination levels, shadows and different postures.\n","PeriodicalId":54987,"journal":{"name":"Industrial Robot-The International Journal of Robotics Research and Application","volume":null,"pages":null},"PeriodicalIF":1.9000,"publicationDate":"2023-02-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Industrial Robot-The International Journal of Robotics Research and Application","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1108/ir-11-2022-0289","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"ENGINEERING, INDUSTRIAL","Score":null,"Total":0}
Citations: 1
Abstract
Purpose
The study proposes a human–robot interaction (HRI) framework that enables operators to communicate remotely with robots in a simple and intuitive way. It focuses on situations in which operators with no programming skills must accomplish teleoperated tasks involving randomly localized objects of different sizes in an unstructured environment. The purpose of this study is to reduce stress on operators, increase accuracy and reduce task completion time. A particular target application of the proposed system is radioactive isotope production facilities. The proposed approach combines the reactivity of the operator's direct control with the power of vision-based object classification and localization.
Design/methodology/approach
Perceptive real-time gesture control based on a Kinect sensor is formulated through information fusion between human intuitiveness and an augmented reality-based vision algorithm. Objects are localized using a developed feature-based vision algorithm in which the homography is estimated and the Perspective-n-Point problem is solved. The 3D object position and orientation are stored in the robot end-effector memory for final mission adjustment, awaiting a gesture control signal to autonomously pick or place an object. Object classification is performed using a one-shot Siamese neural network (NN) approach to train a proposed deep NN; other well-known models are also used for comparison. The system was contextualized in a nuclear industry application, radioactive isotope production, and validated through a user study involving 10 participants from different backgrounds.
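As a rough illustration of the localization step described above, the Python sketch below shows how a feature-based homography estimate can be chained with an OpenCV Perspective-n-Point solve to recover an object's 3D position and orientation from a single reference image. The ORB detector, planar-object assumption, object dimensions and camera intrinsics are illustrative placeholders, not details taken from the paper.

```python
# Illustrative sketch only: feature-based localization via homography + PnP,
# in the spirit of the pipeline described in the abstract. The ORB detector,
# camera intrinsics and object size are assumptions, not the authors' choices.
import cv2
import numpy as np

def estimate_object_pose(ref_img, scene_img, K, dist, obj_w=0.10, obj_h=0.10):
    """Match a reference template against the scene, estimate a homography,
    then solve PnP on the template corners to get the object's 3D pose."""
    orb = cv2.ORB_create(1000)
    kp1, des1 = orb.detectAndCompute(ref_img, None)
    kp2, des2 = orb.detectAndCompute(scene_img, None)

    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:100]

    src = np.float32([kp1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([kp2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)

    # Project the reference image corners into the scene via the homography.
    h, w = ref_img.shape[:2]
    ref_corners = np.float32([[0, 0], [w, 0], [w, h], [0, h]]).reshape(-1, 1, 2)
    scene_corners = cv2.perspectiveTransform(ref_corners, H)

    # Corresponding 3D corners of the (assumed planar) object, in metres.
    obj_corners = np.float32([[0, 0, 0], [obj_w, 0, 0],
                              [obj_w, obj_h, 0], [0, obj_h, 0]])

    # Solve Perspective-n-Point to obtain rotation and translation vectors.
    ok, rvec, tvec = cv2.solvePnP(obj_corners, scene_corners, K, dist)
    return (rvec, tvec) if ok else None
```

The resulting pose (rvec, tvec) could then be cached until a gesture trigger starts the autonomous pick/place motion, mirroring the workflow described in the abstract.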
Findings
The user study revealed the effectiveness of the proposed teleoperation system and demonstrated its potential for enabling users with no robotics experience to effectively accomplish remote robot tasks.
Social implications
The proposed system reduces risk and increases the level of safety when applied in hazardous environments such as nuclear facilities.
Originality/value
The contribution and uniqueness of the presented study lie in the development of a well-integrated HRI system that can tackle the four aforementioned circumstances in an effective and user-friendly way. High operator–robot reactivity is maintained through the direct control method, while much of the cognitive stress is removed by using an elective/flapped autonomous mode to manipulate randomly localized objects of different configurations. This necessitates building an effective deep learning algorithm (compared against well-known methods) to recognize objects under different conditions: illumination levels, shadows and different postures.
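To make the one-shot classification idea concrete, the sketch below outlines a minimal Siamese network with a contrastive loss in PyTorch; the backbone, embedding size and margin are assumptions for illustration and do not reflect the architecture proposed in the paper.

```python
# Minimal illustrative Siamese network for one-shot object classification,
# not the authors' architecture; backbone and hyperparameters are assumed.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SiameseNet(nn.Module):
    def __init__(self, embedding_dim=128):
        super().__init__()
        # Small convolutional backbone mapping an image to an embedding vector.
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            nn.Linear(64 * 4 * 4, embedding_dim),
        )

    def forward(self, x1, x2):
        # Shared weights: both inputs pass through the same backbone.
        return self.backbone(x1), self.backbone(x2)

def contrastive_loss(z1, z2, same_class, margin=1.0):
    """Pull embeddings of the same object together, push different ones apart."""
    d = F.pairwise_distance(z1, z2)
    return torch.mean(same_class * d.pow(2) +
                      (1 - same_class) * torch.clamp(margin - d, min=0).pow(2))

# At inference, a query image is compared against one reference image per class
# and assigned to the class whose reference embedding is nearest.
```

Training on image pairs in this way is one common route to robustness against varying illumination, shadows and postures, since the network learns a similarity metric rather than fixed class boundaries; the paper's actual training scheme may differ.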
About the journal
Industrial Robot publishes peer reviewed research articles, technology reviews and specially commissioned case studies. Each issue includes high quality content covering all aspects of robotic technology, and reflecting the most interesting and strategically important research and development activities from around the world.
The journal's policy of not publishing work that has only been tested in simulation means that only the very best and most practical research articles are included. This ensures that the material that is published has real relevance and value for commercial manufacturing and research organizations. Industrial Robot's coverage includes, but is not restricted to:
Automatic assembly
Flexible manufacturing
Programming optimisation
Simulation and offline programming
Service robots
Autonomous robots
Swarm intelligence
Humanoid robots
Prosthetics and exoskeletons
Machine intelligence
Military robots
Underwater and aerial robots
Cooperative robots
Flexible grippers and tactile sensing
Robot vision
Teleoperation
Mobile robots
Search and rescue robots
Robot welding
Collision avoidance
Robotic machining
Surgical robots
Call for Papers 2020
AI for Autonomous Unmanned Systems
Agricultural Robot
Brain-Computer Interfaces for Human-Robot Interaction
Cooperative Robots
Robots for Environmental Monitoring
Rehabilitation Robots
Wearable Robotics/Exoskeletons.