Demo: The Future of Dog Walking – Four-Legged Robots and Augmented Reality
Jannek Steinke, Justus Rischke, Peter Sossalla, Johannes Hofer, Christian L. Vielhaus, Nico vom Hofe, F. Fitzek
2023 IEEE 24th International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM), June 2023. DOI: 10.1109/WoWMoM57956.2023.00060
Abstract
New generations of mobile networks are opening up novel possibilities for controlling robots remotely in real time. Because 5G is required to support use cases that demand low latency and high reliability from the communication network, wireless control applications become feasible. A remote operator typically controls a mobile robot with a handheld device equipped with buttons or joysticks. Joysticks, however, confine both the control input and the display of camera data to two dimensions, which can make operation difficult. Augmented Reality (AR)-based control, in contrast, lets the operator issue commands in three-dimensional space, so new user-friendly Human Machine Interfaces (HMIs) can improve the interaction with these robots. In this demonstration, we present a human-in-the-loop application with a novel HMI: a remote operator controls a four-legged Boston Dynamics Spot robot with gestures while wearing the Microsoft HoloLens 2 AR headset. The operator receives feedback from the robot in the form of live camera streams visualised on holographic screens.
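The abstract does not describe how HoloLens 2 gestures are turned into robot motion commands, so the following is a minimal sketch of one plausible mapping, assuming the headset streams the operator's hand displacement relative to a calibrated origin and the displacement is converted into planar velocity set-points for the robot. All names, gains, and the dead-zone below are illustrative assumptions, not the authors' implementation.

```python
from dataclasses import dataclass

# Illustrative assumption: the HoloLens 2 app streams the operator's hand
# position relative to a calibrated origin, in metres (x forward, y left),
# plus a wrist yaw angle in radians. None of these names come from the paper.

@dataclass
class HandPose:
    dx: float    # forward/backward hand displacement [m]
    dy: float    # left/right hand displacement [m]
    dyaw: float  # wrist rotation about the vertical axis [rad]

# Assumed tuning constants: gain per unit of displacement and velocity caps.
LINEAR_GAIN = 2.0    # (m/s) per metre of hand displacement
ANGULAR_GAIN = 1.5   # (rad/s) per radian of wrist rotation
MAX_LINEAR = 1.0     # m/s, conservative walking-speed cap
MAX_ANGULAR = 1.0    # rad/s
DEAD_ZONE = 0.05     # m, ignore small hand jitter


def _clamp(value: float, limit: float) -> float:
    return max(-limit, min(limit, value))


def gesture_to_velocity(pose: HandPose) -> tuple[float, float, float]:
    """Map a hand pose to a (v_x, v_y, v_rot) velocity command for the robot."""
    dx = 0.0 if abs(pose.dx) < DEAD_ZONE else pose.dx
    dy = 0.0 if abs(pose.dy) < DEAD_ZONE else pose.dy
    v_x = _clamp(LINEAR_GAIN * dx, MAX_LINEAR)
    v_y = _clamp(LINEAR_GAIN * dy, MAX_LINEAR)
    v_rot = _clamp(ANGULAR_GAIN * pose.dyaw, MAX_ANGULAR)
    return v_x, v_y, v_rot


if __name__ == "__main__":
    # Hand held 0.3 m forward and rotated slightly: walk forward, turn gently.
    print(gesture_to_velocity(HandPose(dx=0.3, dy=0.0, dyaw=0.2)))
```

On the robot side, such a (v_x, v_y, v_rot) triple is the form expected by the velocity-command interface of the publicly available Spot SDK (the bosdyn-client Python package); whether the authors use that SDK, and how their gesture recognition and video streaming are implemented, is not stated in the abstract.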