{"title":"Automatic Engagement Detection for Social Situation Assessment","authors":"Nicola Webb, M. Giuliani, Séverin Lemaignan","doi":"10.31256/mm8xb7o","DOIUrl":"https://doi.org/10.31256/mm8xb7o","url":null,"abstract":"","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130518566","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dynamic Camera Usage in Mobile Teleoperation System for Buzz Wire Task","authors":"C. Peers, D. Kanoulas, B. Kaddouh, R. Richardson, Chengxu Zhou","doi":"10.31256/gn9mg2i","DOIUrl":"https://doi.org/10.31256/gn9mg2i","url":null,"abstract":"—Visual feedback is the most important form of perception within teleoperation, therefore there is a need for a solution that allows for increased potential information gain that a camera can provide, this can be obtained by having a camera that is able to move its position relatively to the base robot. Therefore, this paper focuses on the use of a drone to act as dynamic camera in teleoperation scenarios. The drone control is performed via the use of hand tracking through a wearable motion capture suit and is built upon an existing teleoperation control framework. The usability of the dynamic camera is demonstrated through the use of a simulated drone to act as a dynamic camera in a simulated buzz wire task.","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"66 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114424342","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysis of VR Usability in Mobile Manipulator Teleoperation","authors":"Yuhui Wan, C. Peers, D. Kanoulas, Chengxu Zhou","doi":"10.31256/fm3nb6g","DOIUrl":"https://doi.org/10.31256/fm3nb6g","url":null,"abstract":"—Hazardous materials incident responding and explo- sive ordnance disposal (EOD) are two of mobile manipulators’ most common deployment areas. Most of the time, these incidents happen in open environments and even ruin structures. There-fore, teleoperation is still the dominant robot control method. However, a direct line of sight can be limited. This study includes an experiment demonstrating a virtual reality device application in a real-world mobile manipulator EOD mission. The result support the feasibility of teleoperating a mobile manipulator using VR to complete complex missions.","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"138 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127456188","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Opening a Spring-Loaded Door with a Legged Manipulator","authors":"Jun Li, C. Peers, Songyan Xin, Chengxu Zhou","doi":"10.31256/to1ke3e","DOIUrl":"https://doi.org/10.31256/to1ke3e","url":null,"abstract":"","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122303856","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design and development of a wheged nuclear robot","authors":"Dominic Murphy, M. Giuliani","doi":"10.31256/th2vi1k","DOIUrl":"https://doi.org/10.31256/th2vi1k","url":null,"abstract":"","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"96 9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126029083","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Short Survey on Recent State-of-the-Art Methods for Optimal Path Planning for Small On-Orbit Space Robots","authors":"Jonathan Arreola, Saurabh Upadhyay","doi":"10.31256/du2je8g","DOIUrl":"https://doi.org/10.31256/du2je8g","url":null,"abstract":"","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123866834","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Utilising Weather and Terrain Data to Improve Autonomous Navigation","authors":"S. Heron, F. Labrosse, Patricia Shaw","doi":"10.31256/ji5of5z","DOIUrl":"https://doi.org/10.31256/ji5of5z","url":null,"abstract":"","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131359359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Robust View Based Navigation through View Classification","authors":"Amany Azevedo Amin, Efstathios Kagioulis, Norbert Domcsek, P. Graham, T. Nowotny, A. Philippides","doi":"10.31256/xq3eo4f","DOIUrl":"https://doi.org/10.31256/xq3eo4f","url":null,"abstract":"—Current implementations of view-based navigation on robots have shown success, but are limited to routes of < 10m [1] [2]. This is in part because current strategies do not take into account whether a view has been correctly recognised, moving in the most familiar direction given by the rotational familiarity function (RFF) regardless of prediction confidence. We demonstrate that it is possible to use the shape of the RFF to classify if the current view is from a known position, and thus likely to provide valid navigational information, or from a position which is unknown , aliased or occluded and therefore likely to result in erroneous movement. Our model could classify these four view types with accuracies of 1.00, 0.91, 0.97 and 0.87 respectively. We hope to use these results to extend online view-based navigation and prevent robot loss in complex environments.","PeriodicalId":144066,"journal":{"name":"UKRAS22 Conference \"Robotics for Unconstrained Environments\" Proceedings","volume":"372 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134231507","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}