{"title":"Plastic ‘personalities’ for effective field swarms","authors":"Edmund R. Hunt","doi":"10.31256/xa2lf8k","DOIUrl":"https://doi.org/10.31256/xa2lf8k","url":null,"abstract":"Most studies on real-world multi-robot systems have been performed in controlled laboratory environments, whereas the real world is unpredictable and sometimes hazardous. I have recently suggested that the natural phenomenon of phenotypic plasticity provides a useful bioinspiration framework for making such systems more resilient in field conditions [1]. Phenotypic plasticity occurs when a single genotype produces a range of phenotypes (observable traits) in response to different environmental conditions. Consistent individual behavioural differences can result from such plasticity, and have been described as ‘personalities’. At the same time, in social animals, individual heterogeneity is increasingly recognised as functional for the group. As engineers designing field robot systems, we can exploit this functional heterogeneity, and phenotypic plasticity can provide meaningful diversity ‘for free’, based on the local experience of agents. Personality axes such as bold–shy or social–asocial can be represented as single variables, with the advantage of being transparent and intuitive for human users, and predictable in their effects. For example, in a dangerous environment, robots may become more ‘shy’ and ‘social’ to stay closer together and out of harm’s way.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123643778","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Intention Recognition for Human-Interacting Agricultural Robots","authors":"Alexander Gabriel, Paul E. Baxter","doi":"10.31256/ye5nz9w","DOIUrl":"https://doi.org/10.31256/ye5nz9w","url":null,"abstract":"Robots sharing a common working space with humans and interacting with them to accomplish some task should not only optimise task efficiency, but also consider the safety and comfort of their human collaborators. This requires the recognition of human intentions in order for the robot to anticipate behaviour and act accordingly. In this paper we propose a robot behavioural controller that incorporates both human behaviour and environment information as the basis of reasoning over the appropriate responses. Applied to Human-Robot Interaction in an agricultural context, we demonstrate in a series of simulations how this proposed method leads to the production of appropriate robot behaviour in a range of interaction scenarios. This work lays the foundation for the wider consideration of contextual intention recognition for the generation of interactive robot behaviour.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"33 15","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141206548","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Determining shape of strawberry crops with spherical harmonics","authors":"Justin Le Louëdec, Grzegorz Cielniak","doi":"10.31256/mc8hl1a","DOIUrl":"https://doi.org/10.31256/mc8hl1a","url":null,"abstract":"Shape description and shape reconstruction are two challenges found in computer vision and graphics, as well as in perception for robotics, especially in fields such as agri-robotics (robotics for agriculture). A reliable description of shape that also translates directly into a high-fidelity model of the shape would be of great importance for many applications, such as phenotyping or agronomy. In this paper we report on our work using spherical harmonics to provide an efficient representation of strawberry shapes, and we validate them by reconstructing the fruits. The reconstruction achieves results extremely close to the original shape (less than 1% deviation), and the representation reduces complexity and improves compactness by a large factor (at least 100).","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114816439","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automated Topological Mapping for Agricultural Robots","authors":"K. Heiwolt, Willow Mandil, Grzegorz Cielniak, Marc Hanheide","doi":"10.31256/ze8ex1v","DOIUrl":"https://doi.org/10.31256/ze8ex1v","url":null,"abstract":"Essential to agricultural robot deployment in farms are accurate topological maps, which are manually created in current systems. In this work we present a novel approach to automatically generate a topological map along crop rows from aerial images for the deployment of agricultural mobile robots. We evaluate our system in a digital twin of a farm environment using real-world textures and physical simulation, and also demonstrate its applicability to aerial images of a real farm.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130826783","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Goods and Bads in Dyadic Co-Manipulation: Identifying Conflict-Driven Interaction Behaviours in Human-Human Collaboration","authors":"Illimar Issak, Ayse Kucukyilmaz","doi":"10.31256/fv3gn1l","DOIUrl":"https://doi.org/10.31256/fv3gn1l","url":null,"abstract":"One of the challenges in collaborative human-robot object transfer is the robot’s ability to infer the interaction state and adapt to it in real time. During joint object transfer, humans communicate about the interaction state through multiple modalities and adapt to one another’s actions so that the interaction is successful. Knowledge of the current interaction state (i.e. harmonious, conflicting or passive interaction) can help us adjust our behaviour to carry out the task successfully. This study investigates the effectiveness of physical Human-Human Interaction (pHHI) forces for predicting interaction states during ongoing object co-manipulation. We use a sliding-window method for extracting features and perform online classification to infer the interaction states. Our dataset consists of haptic data from 40 subjects partnered to form 20 dyads. The dyads performed collaborative object transfer tasks in a haptics-enabled virtual environment, moving an object to predefined goal configurations in different harmonious and conflicting scenarios. We evaluate our approach using a multi-class Support Vector Machine classifier (SVMc) and a Gaussian Process classifier (GPc), achieving 80% accuracy in classifying general interaction types.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116141126","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Does Expression of Grounded Affect in a Hexapod Robot Elicit More Prosocial Responses?","authors":"Luke Hickton, Matthew Lewis, K. Koay, Lola Cañamero","doi":"10.31256/hz3ww4t","DOIUrl":"https://doi.org/10.31256/hz3ww4t","url":null,"abstract":"We consider how non-humanoid robots can communicate their affective state via bodily forms of communication, and the extent to which this can influence human response. We propose a simple model of grounded affect and kinesic expression and outline two experiments (N=9 and N=180) in which participants were asked to watch expressive and non-expressive hexapod robots perform different ‘scenes’. Our preliminary findings suggest that the expressive robot stimulated a greater desire for interaction and was more likely to be attributed with emotion. It also elicited a greater desire for prosocial behaviour.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"31 4","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141206575","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Human-in-the-Loop Adaptation and Reuse of Robot Assistance Policies for Ambient Assisted Living","authors":"Ronnie Smith","doi":"10.31256/fn5eb4a","DOIUrl":"https://doi.org/10.31256/fn5eb4a","url":null,"abstract":"Personalisation and adaptation of Ambient Assisted Living (AAL) solutions is the subject of many existing works, which seek to embed and/or learn user preferences and needs. However, with a focus on lab-based evaluation, it has been easy to overlook the potential realities of what AAL might look like in the future: a range of heterogeneous platforms, devices, and robots will generate swathes of knowledge that must be shared, within and outside the home. As such, it is important to consider scalability and interoperability in every aspect of future AAL solutions. Adaptivity as a Service (AaaS) is proposed as a highly specialised service with a core function of personalising and adapting smart homes and AAL systems to the needs and wants of individuals.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"44 24","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141206097","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Topological Robot Localization in a Pipe Network","authors":"R. Worley, S. Anderson","doi":"10.31256/zw1wq5m","DOIUrl":"https://doi.org/10.31256/zw1wq5m","url":null,"abstract":"Topological localization is advantageous for robots with limited sensing ability in pipe networks, where localization is made difficult if a robot incorrectly executes an action and arrives at an unknown junction. A novel incorporation of distance-travelled measurements into a Hidden Markov Model-based localization method is shown to improve accuracy.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114977677","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards Insect Inspired Visual Sensors for Robots","authors":"Blayze Millward, Steve C. Maddock, M. Mangan","doi":"10.31256/do2ik3h","DOIUrl":"https://doi.org/10.31256/do2ik3h","url":null,"abstract":"Flying insects display a repertoire of complex behaviours facilitated by their non-standard visual systems which, if understood, could offer solutions for weight- and power-constrained robotic platforms such as micro unmanned aerial vehicles (MUAVs). Crucial to this goal is revealing the specific features of insect eyes that engineered solutions would benefit from possessing; however, progress in exploring the design space has been limited by the challenge of accurately replicating insect vision. Here we propose that emerging ray-tracing technologies are ideally placed to realise high-fidelity replication of the insect visual perspective in a rapid, modular and adaptive framework, allowing the development of technical specifications for a new class of bio-inspired sensor. A proof-of-principle insect eye renderer is shown, and the research directions it affords are discussed.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133508547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Task Delegation and Architecture for Autonomous Excavators","authors":"J. Rankin, L. Justham, Y. Goh, J. Morley, Jcb Excavators Ltd","doi":"10.31256/ew3zn4z","DOIUrl":"https://doi.org/10.31256/ew3zn4z","url":null,"abstract":"The construction industry is required to deliver safe, productive machines. One method being considered by heavy equipment manufacturers is autonomy. Implementing autonomy in heavy machines is unique because machine operation is highly skilled, meaning that different levels of autonomy may be more suitable for different tasks. Therefore, effective collaboration strategies between human operators and machines are needed. This paper proposes a machine architecture that considers task delegation between the operator and machine.","PeriodicalId":393014,"journal":{"name":"UKRAS20 Conference: \"Robots into the real world\" Proceedings","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-05-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122844512","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}