Design and Evaluation of a Mixed Reality-based Human-Robot Interface for Teleoperation of Omnidirectional Aerial Vehicles

Mike Allenspach, Till Kötter, Rik Bähnemann, M. Tognon, R. Siegwart
2023 International Conference on Unmanned Aircraft Systems (ICUAS), June 6, 2023
DOI: 10.1109/ICUAS57906.2023.10156426

Abstract

Omnidirectional aerial vehicles are an attractive solution for visual inspection tasks that require observations from different views. However, the decisional autonomy of modern robots is limited. Therefore, human input is often necessary to safely explore complex industrial environments. Existing teleoperation tools rely on on-board camera views or 3D renderings of the environment to improve situational awareness. Mixed Reality (MR) offers an exciting alternative, allowing the user to perceive and control the robot's motion in the physical world. Furthermore, since MR technology is not limited by the hardware constraints of standard teleoperation interfaces, such as haptic devices or joysticks, it allows us to explore new reference generation and user feedback methodologies. In this work, we investigate the potential of MR for teleoperating 6DoF aerial robots by designing a holographic user interface (see Fig. 1) to control their translational velocity and orientation. A user study with 13 participants is performed to assess the proposed approach. The evaluation confirms the effectiveness and intuitiveness of our methodology, independent of prior user experience with aerial vehicles or MR. However, prior familiarity with MR improves task completion time. The results also highlight a limitation of the approach: operation is restricted to line-of-sight distances at which relevant details in the physical environment can still be visually distinguished.