Title: Augmented Reality as a Medium for Human-Robot Collaborative Tasks
Authors: S. M. Chacko, V. Kapila
DOI: 10.1109/RO-MAN46459.2019.8956466
Published in: 2019 28th IEEE International Conference on Robot and Human Interactive Communication (RO-MAN)
Publication date: 2019-10-01
Citations: 9
Abstract
This paper presents a novel augmented reality (AR) interaction method that allows a robot to manipulate unknown physical objects in a human-robot collaborative working environment. A mobile AR application is developed to determine and communicate, in real time, the position, orientation, and dimensions of an arbitrary object in a robot manipulator's workspace so that pick-and-place operations can be performed. The proposed method estimates the pose and size of the object by means of an AR virtual element superimposed on the live view of the real object. In particular, a semi-transparent AR element is created and manipulated through touch-screen interactions to match the pose and scale of the physical object, thereby capturing that object's geometry. The resulting data is communicated to the robot manipulator to perform pick-and-place tasks. In this way, the AR virtual element acts as a medium of communication between a human and a robot. The performance of the proposed AR interface is assessed by conducting multiple trials with randomly selected objects, and the robot is observed to successfully accomplish the tasks communicated through the AR virtual elements. The proposed interface is also evaluated with 20 users to assess the quality of the user experience, followed by a post-study survey. The participants reported that the AR interface is intuitive and easy to operate for manipulating physical objects of various sizes and shapes.
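To make the communication step concrete, the sketch below illustrates one way the aligned AR element's state could be packaged and sent to a robot controller. This is a minimal illustration under assumed conventions, not the paper's implementation: the field names, the workspace frame, and the choice of a single yaw angle and a JSON message are all hypothetical.

```python
from dataclasses import dataclass
import json

@dataclass
class ARElementState:
    """State of the semi-transparent AR element after the user aligns it
    with the physical object (all names here are illustrative)."""
    position: tuple   # (x, y, z) in an assumed robot workspace frame, metres
    yaw_deg: float    # rotation about the vertical axis, degrees (assumed sufficient for a tabletop object)
    scale: tuple      # (length, width, height) of the scaled element, metres

def to_pick_message(state: ARElementState) -> str:
    """Serialize the aligned element's pose and size into a JSON message
    that a hypothetical robot controller could consume for a pick-and-place task."""
    payload = {
        "object_pose": {
            "x": state.position[0],
            "y": state.position[1],
            "z": state.position[2],
            "yaw_deg": state.yaw_deg,
        },
        "object_size": {
            "length": state.scale[0],
            "width": state.scale[1],
            "height": state.scale[2],
        },
    }
    return json.dumps(payload)

# Example: the user scaled the element to a 6 x 4 x 3 cm box
# centred at (0.30, 0.10, 0.015) and rotated 45 degrees.
msg = to_pick_message(
    ARElementState(position=(0.30, 0.10, 0.015), yaw_deg=45.0,
                   scale=(0.06, 0.04, 0.03)))
```

In the paper's terms, the virtual element itself is the medium: once the user finishes aligning it, a message of this kind carries everything the manipulator needs to plan the pick.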