{"title":"利用平板电脑上的移动增强现实界面实现基于远程操作的机器人运动学交互学习","authors":"J. Frank, V. Kapila","doi":"10.1109/INDIANCC.2016.7441163","DOIUrl":null,"url":null,"abstract":"The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in teleoperation of robots, hands-on learning in classrooms, laboratory, and special education, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, medical, etc. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application executing on a tablet device, which renders an immersive AR-based graphical user interface to enable users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines are used to extract real-time, vision-based measurements of the robot's joint angles and end effector location from the live video captured by the rear-facing camera on the tablet. The obtained measurements are used to render AR content to offer users with additional visual feedback. Touch gesture recognition is implemented to allow users to naturally and intuitively command the robot by tapping and dragging their fingers at desired locations on the tablet screen. Experimental results show the performance and efficacy of the proposed system as it is operated in two different modes: one in which the user has direct control over the angles of the actuated links of the robot and one in which the user has direct control over the end effector location.","PeriodicalId":286356,"journal":{"name":"2016 Indian Control Conference (ICC)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"10","resultStr":"{\"title\":\"Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet\",\"authors\":\"J. Frank, V. Kapila\",\"doi\":\"10.1109/INDIANCC.2016.7441163\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in teleoperation of robots, hands-on learning in classrooms, laboratory, and special education, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, medical, etc. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application executing on a tablet device, which renders an immersive AR-based graphical user interface to enable users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines are used to extract real-time, vision-based measurements of the robot's joint angles and end effector location from the live video captured by the rear-facing camera on the tablet. The obtained measurements are used to render AR content to offer users with additional visual feedback. Touch gesture recognition is implemented to allow users to naturally and intuitively command the robot by tapping and dragging their fingers at desired locations on the tablet screen. 
Experimental results show the performance and efficacy of the proposed system as it is operated in two different modes: one in which the user has direct control over the angles of the actuated links of the robot and one in which the user has direct control over the end effector location.\",\"PeriodicalId\":286356,\"journal\":{\"name\":\"2016 Indian Control Conference (ICC)\",\"volume\":\"30 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-03-28\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"10\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 Indian Control Conference (ICC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/INDIANCC.2016.7441163\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 Indian Control Conference (ICC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/INDIANCC.2016.7441163","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Towards teleoperation-based interactive learning of robot kinematics using a mobile augmented reality interface on a tablet
The integration of augmented reality (AR) techniques in user interface design has enhanced interactive experiences in the teleoperation of robots, hands-on learning in classroom, laboratory, and special-education settings, and user training in an array of fields, e.g., aerospace, automotive, construction, manufacturing, and medical. However, AR-based user interfaces that command machines and tools have not been fully explored for their potential to enhance interactive learning of engineering concepts in the laboratory. This paper outlines the development of a mobile application executing on a tablet device, which renders an immersive AR-based graphical user interface that enables users to monitor, interact with, and control a four-link underactuated planar robot. Computer vision routines extract real-time, vision-based measurements of the robot's joint angles and end effector location from the live video captured by the rear-facing camera of the tablet. The obtained measurements are used to render AR content that provides users with additional visual feedback. Touch gesture recognition is implemented to allow users to command the robot naturally and intuitively by tapping and dragging their fingers at desired locations on the tablet screen. Experimental results demonstrate the performance and efficacy of the proposed system as it is operated in two different modes: one in which the user directly controls the angles of the robot's actuated links, and one in which the user directly controls the end effector location.
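The abstract does not give implementation details, but the two operating modes correspond to the standard kinematic relationships for a planar serial chain: forward kinematics maps commanded joint angles to an end effector position (the joint-angle control mode), and an inverse-kinematics solve maps a desired end effector location back to joint angles (the end-effector control mode). The following is a minimal sketch of those relationships for a four-link planar robot; the link lengths, function names, and damped-least-squares solver parameters are hypothetical illustrations, not the paper's actual implementation.

```python
import numpy as np

def forward_kinematics(joint_angles, link_lengths):
    """Planar forward kinematics: positions of each joint and of the end effector."""
    x, y, theta = 0.0, 0.0, 0.0
    points = [(x, y)]
    for q, l in zip(joint_angles, link_lengths):
        theta += q                       # absolute orientation of this link
        x += l * np.cos(theta)
        y += l * np.sin(theta)
        points.append((x, y))
    return np.array(points)             # shape (n+1, 2); last row is the end effector

def jacobian(joint_angles, link_lengths):
    """2 x n Jacobian of the end-effector position w.r.t. the joint angles."""
    points = forward_kinematics(joint_angles, link_lengths)
    ee = points[-1]
    J = np.zeros((2, len(joint_angles)))
    for i in range(len(joint_angles)):
        # Rotating joint i about z moves the end effector perpendicular to (ee - joint_i).
        rx, ry = ee - points[i]
        J[:, i] = (-ry, rx)
    return J

def inverse_kinematics(target, q0, link_lengths, steps=200, gain=0.5, damping=1e-6):
    """Iterate toward a target end-effector position via damped least squares."""
    q = np.array(q0, dtype=float)
    for _ in range(steps):
        err = np.asarray(target, dtype=float) - forward_kinematics(q, link_lengths)[-1]
        if np.linalg.norm(err) < 1e-4:
            break
        J = jacobian(q, link_lengths)
        # Damping keeps the update well-behaved near kinematic singularities.
        dq = J.T @ np.linalg.solve(J @ J.T + damping * np.eye(2), err)
        q += gain * dq
    return q

if __name__ == "__main__":
    lengths = [0.3, 0.25, 0.2, 0.15]             # hypothetical link lengths (m)
    q_measured = np.radians([30, -20, 15, 10])   # e.g., angles estimated from the camera
    ee = forward_kinematics(q_measured, lengths)[-1]
    print("End effector at", ee)
    q_cmd = inverse_kinematics(ee + [0.05, -0.05], q_measured, lengths)
    print("Joint command (deg):", np.degrees(q_cmd))
```

In a teleoperation interface of the kind described, the forward-kinematics map would let the AR overlay display the predicted end effector position for a commanded set of link angles, while an inverse-kinematics solve of this general form would convert a tapped or dragged screen location into joint commands; the specific solver and tuning used by the authors are not stated in the abstract.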