In-situ Optical Characterisation of Nuclear Environments
A. West, Paul E. Coffey, Ioannis Tsitsimpelis, M. Aspinall, Nicholas Smith, M. Joyce, P. Martin, B. Lennox
UK-RAS Conference: Robots Working For and Among Us Proceedings, 12 October 2018. DOI: https://doi.org/10.31256/ukras17.52

Mona: an Affordable Mobile Robot for Swarm Robotic Applications
F. Arvin, J. Mendoza, Benjamin Bird, A. West, S. Watson, B. Lennox
UK-RAS Conference: Robots Working For and Among Us Proceedings, 12 October 2018. DOI: https://doi.org/10.31256/ukras17.16

Mobile robots play a significant role in multi-robot and swarm robotics research. The high cost of commercial mobile robots is a significant challenge that limits the number of swarm-based studies implemented on real robotic platforms, while results obtained from simulated robots are not representative of those that would be obtained with real robots. There are therefore considerable benefits in developing an affordable, open-source, and flexible platform that allows students and researchers to run experiments on real robot systems. Mona is an open-source, open-hardware mobile robot developed at the University of Manchester for this purpose. It can be programmed and operated with relative ease through a user-friendly interface, Arduino, and its low cost makes it feasible to use a large number of these robots in swarm robotic scenarios. This work was supported by EPSRC (Project Nos. EP/P01366X/1 and EP/P018505/1).

Introduction. Swarm robotics is a relatively new concept in multi-robot collective behaviour research that emerged from studies using robots with limited abilities assigned to follow simple tasks [1]. Swarm robotic scenarios are mostly inspired by the social behaviour of insects and other animals, and there have been many successful implementations of swarm behaviours directly inspired by nature (e.g. honeybees [2], cockroaches [3], ants [4], and birds [5]). As highlighted in [6], one of the main criteria of swarm robotics is operating experiments with a "large number of robots", typically at least 10–20. Recently, the number of robots used in swarm robotics has increased dramatically, with swarm sizes of up to 1000 robots being reported [7]. Implementing swarms of this size with commercial robots can therefore be very costly. To tackle this issue, affordable open-source and open-hardware robotic platforms play an important role in research and education. Several mobile robots have been developed and successfully deployed in swarm robotics research, such as Khepera [8], Alice [9], Jasmine [10], E-puck [11], Colias [12], SwarmBot [13], Kilobot [14], and S-bot [15]. These studies imitate bio-inspired collective behaviour; however, only a limited number of low-cost, open-source, and open-hardware mobile robots are available for use in swarm robotics research. For example, Colias is an open-source, low-cost mobile robot developed for swarm scenarios: a large group of Colias robots played the role of young honeybees to mimic BEECLUST aggregation [16], and Colias has also been used to study a bio-inspired vision mechanism [17] and an artificial pheromone communication system [18]. Recently, Mona has been developed as a low-cost mobile robot for research and education p…

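The BEECLUST aggregation mentioned in the entry above reduces to a simple rule: a robot wanders randomly, and whenever it meets another robot it stops and waits for a time that grows with the local stimulus (e.g. light) before turning away, so robots accumulate where the stimulus is strongest. A minimal simulation sketch of that rule is given below; it is an illustration only, not the controller used in [16] or on Mona, and the arena size, sensing radius, stimulus field, and waiting-time formula are all assumptions.

```python
import math
import random

ARENA = 1.0          # assumed 1 m x 1 m square arena
SENSE_RADIUS = 0.05  # assumed robot-to-robot "collision" distance (m)
MAX_WAIT = 60.0      # assumed maximum waiting time (s)
STEP = 0.01          # distance moved per one-second time step (m)

def stimulus(x, y):
    """Assumed light field: brightest at the arena centre, zero near the walls."""
    d = math.hypot(x - ARENA / 2, y - ARENA / 2)
    return max(0.0, 1.0 - 2.0 * d)

class Robot:
    def __init__(self):
        self.x, self.y = random.random() * ARENA, random.random() * ARENA
        self.heading = random.uniform(0, 2 * math.pi)
        self.wait = 0.0  # remaining waiting time (s)

    def step(self, others):
        if self.wait > 0:            # stopped next to another robot
            self.wait -= 1.0
            return
        # random walk with a small heading perturbation, clamped to the arena
        self.heading += random.uniform(-0.3, 0.3)
        self.x = min(max(self.x + STEP * math.cos(self.heading), 0.0), ARENA)
        self.y = min(max(self.y + STEP * math.sin(self.heading), 0.0), ARENA)
        # BEECLUST rule: on meeting another robot, wait for a time that grows
        # with the local stimulus, then turn away (movement resumes after the wait)
        for other in others:
            if other is not self and math.hypot(other.x - self.x, other.y - self.y) < SENSE_RADIUS:
                self.wait = MAX_WAIT * stimulus(self.x, self.y)
                self.heading += math.pi
                break

robots = [Robot() for _ in range(20)]
for _ in range(5000):                # simulate 5000 one-second steps
    for r in robots:
        r.step(robots)
# Robots should now be clustered near the arena centre, where stimulus() peaks.
print(sum(stimulus(r.x, r.y) for r in robots) / len(robots))
```
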
{"title":"A Smart Contract Model for Agent Societies","authors":"Michele Tumminelli, Steve Battle","doi":"10.31256/ukras17.51","DOIUrl":"https://doi.org/10.31256/ukras17.51","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"20 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120924753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An innovative elbow exoskeleton for stages of post-stroke rehabilitation","authors":"S. Manna, V. Dubey","doi":"10.31256/ukras17.34","DOIUrl":"https://doi.org/10.31256/ukras17.34","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121688753","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A proposed structure to capture the operational and technical capabilities of different robots","authors":"Manal Linjawi, R. Moore","doi":"10.31256/ukras17.41","DOIUrl":"https://doi.org/10.31256/ukras17.41","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134623104","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Embodying risk assessment and situational awareness for safe HRI from physical and cognitive control architectures","authors":"Anton Camilleri, Sanya Dogramadzi, P. Caleb-Solly","doi":"10.31256/ukras17.20","DOIUrl":"https://doi.org/10.31256/ukras17.20","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130380052","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Conversational human-swarm interaction using IBM Cloud","authors":"Alan G. Millard, James Williams","doi":"10.31256/ukras17.8","DOIUrl":"https://doi.org/10.31256/ukras17.8","url":null,"abstract":"Swarm robotics is an approach to the coordination of large numbers of robots that has become an increasingly popular field of research in recent years, not least because properly engineered robot swarms are scalable, flexible, and robust, making them an attractive alternative to single-robot systems in many application domains. Since its inception, the field of swarm robotics has grown beyond its roots in purely decentralised control inspired by social insect behaviour, now often utilising hybrid centralised/decentralised control architectures that incorporate human operators who guide swarm actions during tasks such as firefighting, or the localisation of radiation sources. This kind of human-swarm interaction has attracted significant interest from the research community, spawning an entire sub-field of its own that investigates how human operators, supervisors, and team-mates can interact with robot swarms and receive feedback from them. To date, human-swarm control methods such as the use of graphical user interfaces and spatial gestures have received much attention, but there has been little investigation into the potential of controlling swarm robotic systems with an operator’s voice. The few studies that have explored this idea are restricted to the use of specific predefined phrases that the human operator is required to learn, resulting in interactions that are unnatural in comparison to the way a human would normally express themselves in speech. In this paper, we present a novel architecture for conversational human-swarm interaction that addresses these issues, allowing swarm robotic systems to be engineered in such a way that a human operator can guide a swarm using spoken dialogue in a more natural manner.","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2018-10-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130533253","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Camera-based Flexible Force and Tactile Sensor
Wanlin Li, J. Konstantinova, Y. Noh, A. Alomainy, K. Althoefer
UK-RAS Conference: Robots Working For and Among Us Proceedings, 12 October 2018. DOI: https://doi.org/10.31256/ukras17.40

Multi-plane Motion Planning for Multi-Legged Robots
Wei Cheah, P. Green, S. Watson, B. Lennox, F. Arvin
UK-RAS Conference: Robots Working For and Among Us Proceedings, 12 October 2018. DOI: https://doi.org/10.31256/ukras17.22

3D Convolutional Neural Networks for Tree Detection using Automatically Annotated LiDAR data
A. Gupta, Jonathan Byrne, D. Moloney, Hujun Yin, Simon Watson
UK-RAS Conference: Robots Working For and Among Us Proceedings, 12 October 2018. DOI: https://doi.org/10.31256/ukras17.31

Methods. In order to identify trees in LiDAR scans, ground points are first identified and filtered out using a Progressive Morphological Filter. The filtered scan is then voxelized into a sparse 3D hierarchical data structure, VOLA (Byrne et al., 2017), in order to reduce the input resolution. A 2-bits-per-voxel approach is used to encode additional information such as colour, intensity, and number of returns.