UK-RAS Conference: Robots Working For and Among Us Proceedings (Latest Publications)

Wireless Power Transfer for Gas Pipe Inspection Robots
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.26
Doychinov, B. Kaddouh, G. Mills, B. T. Malik, N. Somjit, I. Robertson
{"title":"Wireless Power Transfer for Gas Pipe Inspection Robots","authors":"Doychinov, B. Kaddouh, G. Mills, B. T. Malik, N. Somjit, I. Robertson","doi":"10.31256/ukras17.26","DOIUrl":"https://doi.org/10.31256/ukras17.26","url":null,"abstract":"Wireless power transfer in metal pipes is a promising alternative to tethered exploration robots, with strong potential to enable longer operating times. Here we present experimental results, including rectification efficiency, for a prototype gas pipe inspection robot with wireless power receiver functionality.","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"136 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131207020","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
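The rectification efficiency mentioned in this abstract is conventionally the ratio of recovered DC power to incident RF power. A minimal sketch of that computation follows; all numeric values are illustrative placeholders, not measurements from the paper:

```python
def rectification_efficiency(v_dc, r_load_ohms, p_rf_in_w):
    """DC output power over incident RF power, as a fraction.

    v_dc: DC voltage measured across the rectifier's load resistor
    r_load_ohms: load resistance in ohms
    p_rf_in_w: RF power delivered to the rectifier input, in watts
    """
    p_dc = v_dc ** 2 / r_load_ohms   # P = V^2 / R
    return p_dc / p_rf_in_w

# illustrative numbers only: 2.0 V across 100 ohms from 100 mW of RF
eff = rectification_efficiency(2.0, 100.0, 0.1)  # -> 0.4, i.e. 40%
```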
Robin: An Autonomous Robot for Diabetic Children
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.5
Matthew Lewis, Lola Cañamero
{"title":"Robin: An Autonomous Robot for Diabetic Children","authors":"Matthew Lewis, Cognition Embodied Emotion, Lola Cañamero","doi":"10.31256/ukras17.5","DOIUrl":"https://doi.org/10.31256/ukras17.5","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"175 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134449961","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Motor Imagery Classification based on RNNs with Spatiotemporal-Energy Feature Extraction
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.55
D-D Zhang, Jianlong Zheng, J. Fathi, M. Sun, F. Deligianni, G. Yang
{"title":"Motor Imagery Classification based on RNNs with Spatiotemporal-Energy Feature Extraction","authors":"D-D Zhang, Jianlong Zheng, J. Fathi, M. Sun, F. Deligianni, G. Yang","doi":"10.31256/ukras17.55","DOIUrl":"https://doi.org/10.31256/ukras17.55","url":null,"abstract":"With the recent advances in artificial intelligence and robotics, Brain Computer Interface (BCI) has become a rapidly evolving research area. Motor imagery (MI) based BCIs have several applications in neurorehabilitation and the control of robotic prosthesis because they offer the potential to seamlessly translate human intentions to machine language. However, to achieve adequate performance, these systems require extensive training with high-density EEG systems even for two-class paradigms. Effectively extracting and translating EEG data features is a key challenge in Brain Computer Interface (BCI) development. This paper presents a method based on Recurrent Neural Networks (RNNs) with spatiotemporal-energy feature extraction that significantly improves the performance of existing methods. We present cross-validation results based on EEG data collected by a 16-channel, dry electrodes system to demonstrate the practical use of our algorithm. Introduction Robotic control, based on brainwave decoding, can be used in a range of scenarios including patients with locked-in syndrome, rehabilitation after a stroke, virtual reality games and so on. In these cases, subjects may not be able to move their limbs. For this reason, the development of MI tasks based BCI is very important [1]. During a MI task, the subjects imagine moving a specific part of their body without initiating the actual movements. This process involves the brain networks, which are responsible for motor control similarly to the actual movements. Decoding brain waves is challenging, since EEG signals have limited spatial resolution and low signal to noise ratio. 
Furthermore, experimental conditions, such as subjects’ concentration and prior experience with BCI can bring confounds to the results. Thus far, several approaches have been proposed to classify MI tasks based data but their performances are limited even for the two-class paradigms that involve left and right hand MI tasks [2]. EEG-based BCI normally involves noise filtering, feature extraction and classification. Brain signals are normally analysed in cue-triggered or stimulus-triggered time windows. Related methods include identifying changes in Event Potentials (EPs), slow cortical potentials shifts, quantify oscillatory EEG components and so on [3]. These types of BCI are operated with predefined time windows. Furthermore, the interand intra-subject variability cannot be overlooked when finding suitable feature representation model. Recently, Deep Neural Networks (DNNs) have emerged with promising results in several applications. Their adaptive nature allows them to automatically extract relevant features from data without extensive preprocessing and prior knowledge about the signals [4]. Convolutional Neural Networks (CNNs) have been used to classify EEG features by transforming the temporal domain into spatial domain [5]. However, the CNN structure is static and inherently not suitable for processing temporal patterns. F","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"204 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127046074","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
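The RNN-with-energy-features approach summarised above can be sketched as a recurrent pass over per-channel feature vectors of an EEG window, classifying the final hidden state. Everything below (a plain Elman cell, the layer sizes, the random weights) is an illustrative assumption, not the authors' trained architecture:

```python
import numpy as np

def rnn_mi_classifier(eeg_window, Wxh, Whh, Why, bh, by):
    """Forward pass of a minimal Elman RNN over one EEG window.

    eeg_window: (T, C) array of T time steps of C per-channel
    features (e.g. band-power "energy" features). Returns softmax
    probabilities for a two-class MI paradigm (left vs right hand).
    """
    h = np.zeros(Whh.shape[0])
    for x_t in eeg_window:                     # step through time
        h = np.tanh(Wxh @ x_t + Whh @ h + bh)  # recurrent update
    logits = Why @ h + by                      # classify final state
    exp = np.exp(logits - logits.max())        # stable softmax
    return exp / exp.sum()

# toy example: 16-channel features over 50 time steps, random weights
rng = np.random.default_rng(0)
T, C, H = 50, 16, 8
probs = rnn_mi_classifier(
    rng.standard_normal((T, C)),
    Wxh=rng.standard_normal((H, C)) * 0.1,
    Whh=rng.standard_normal((H, H)) * 0.1,
    Why=rng.standard_normal((2, H)) * 0.1,
    bh=np.zeros(H), by=np.zeros(2),
)
```

In a real pipeline the weights would be learned by backpropagation through time, and the input features would come from the spatiotemporal-energy extraction stage rather than random noise.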
Assessing Pose Features for Predicting User Intention during Dressing with Deep Networks
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/UKRAS17.1
Greg Chance, A. Jevtić, P. Caleb-Solly, G. Alenyà, S. Dogramadzi
{"title":"Assessing Pose Features for Predicting User Intention during Dressing with Deep Networks","authors":"Greg Chance, A. Jevtić, P. Caleb-Solly, G. Alenyà, S. Dogramadzi","doi":"10.31256/UKRAS17.1","DOIUrl":"https://doi.org/10.31256/UKRAS17.1","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"13 4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123646982","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
A practical mSVG interaction method for patrol, search, and rescue aerobots
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.13
A. Abioye, S. Prior, T. G. Thomas, P. Saddington, S. Ramchurn
{"title":"A practical mSVG interaction method for patrol, search, and rescue aerobots","authors":"A. Abioye, S. Prior, T. G. Thomas, P. Saddington, S. Ramchurn","doi":"10.31256/ukras17.13","DOIUrl":"https://doi.org/10.31256/ukras17.13","url":null,"abstract":"This paper briefly presents the multimodal speech and visual gesture (mSVG) control for aerobots at higher nCA autonomy levels, using a patrol, search, and rescue application example. The developed mSVG control architecture was presented and briefly discussed. This was successfully tested using both MATLAB simulation and python based ROS Gazebo UAV simulations. Some limitations were identified, which formed the basis for the further works presented.","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129163730","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
A Modified Computed Torque Control Approach for a Teleoperation Master-Slave Robot Manipulator System
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.10
Ololade O. Obadina, J. Bernth, K. Althoefer, M. Shaheed
{"title":"A Modified Computed Torque Control Approach for a Teleoperation Master- Slave Robot Manipulator System","authors":"Ololade O. Obadina, J. Bernth, K. Althoefer, M. Shaheed","doi":"10.31256/ukras17.10","DOIUrl":"https://doi.org/10.31256/ukras17.10","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"703 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116423558","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
An Optimised Deep Neural Network Approach for Forest Trail Navigation for UAV Operation within the Forest Canopy
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.7
Bruna G. Maciel-Pearson, Patrice Carbonneau, T. Breckon
{"title":"An Optimised Deep Neural Network Approach for Forest Trail Navigation for UAV Operation within the Forest Canopy","authors":"Bruna G. Maciel-Pearson, Pratrice Carbonneu, T. Breckon","doi":"10.31256/ukras17.7","DOIUrl":"https://doi.org/10.31256/ukras17.7","url":null,"abstract":"Autonomous flight within a forest canopy represents a key challenge for generalised scene understanding \u0000on-board a future Unmanned Aerial Vehicle (UAV) platform. Here we present an approach for automatic \u0000trail navigation within such an environment that successfully generalises across differing image resolutions - \u0000allowing UAV with varying sensor payload capabilities to operate equally in such challenging environmental \u0000conditions. Specifically, this work presents an optimised deep neural network architecture, capable of stateof-the-art \u0000performance across varying resolution aerial UAV imagery, that improves forest trail detection for \u0000UAV guidance even when using significantly low resolution images that are representative of low-cost search \u0000and rescue capable UAV platforms.","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"82 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124311266","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 1
On Decision-making for Computation Offloading in Cloud-assisted Autonomous Vehicle Systems
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/ukras17.6
Yi Lu, C. Maple, T. Sheik, M. Dianati, A. Mouzakitis
{"title":"On Decision-making for Computation Offloading in Cloud-assisted Autonomous Vehicle Systems","authors":"Yi Lu, C. Maple, T. Sheik, M. Dianati, A. Mouzakitis","doi":"10.31256/ukras17.6","DOIUrl":"https://doi.org/10.31256/ukras17.6","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"27 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115470036","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Geographies of Robotization and Automation
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-12 DOI: 10.31256/UKRAS17.4
M. Kovacic, A. Lockhart
{"title":"Geographies of Robotization and Automation","authors":"M. Kovacic, A. Lockhart","doi":"10.31256/UKRAS17.4","DOIUrl":"https://doi.org/10.31256/UKRAS17.4","url":null,"abstract":"","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133298993","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
Wireless Communications in Nuclear Decommissioning Environments
UK-RAS Conference: Robots Working For and Among Us Proceedings Pub Date: 2017-12-04 DOI: 10.31256/ukras17.23
A. Buono, Neil Cockbain, Peter T. Green, B. Lennox
{"title":"Wireless Communications in Nuclear Decommissioning Environments","authors":"A. Buono, Neil Cockbain, Peter T. Green, B. Lennox","doi":"10.31256/ukras17.23","DOIUrl":"https://doi.org/10.31256/ukras17.23","url":null,"abstract":"The use of Wireless Sensor Networks (WSN) is now widespread, with well-documented deployments across a diverse range of sectors including aerospace, agri-science and consumer electronics. In the nuclear industry there have been successful deployments of the WSN technologies for instrumentation and control, however, there are significant challenges that need to be addressed before wireless sensing can be used in nuclear decommissioning environments. These challenges include: limited sources of power; the radiation tolerance of the sensor and communication system components; the severe attenuation of wireless signals through reinforced wall structures; and the need to deliver secure, interoperable and reliable communication. Introduction Robotics and Automation applications within the nuclear decommissioning industry are rapidly increasing to reduce the cost, time and dose exposure of workers [1]. In addition, there is the need to store nuclear waste and monitoring the condition of the packages in the stores [2]. The design, prototype and evaluation of Wireless Sensor Network with the capability to deliver remote sensing and control can result in reduction of the cost and time to install robotics application and improve the performance, collecting data from hard to reach places not designed to be decommissioned. As a result a successful application can lead to an increase of robotics and automation in the nuclear Industry. Benefits and Challenges Wireless Sensor Networks are extensively employed in agriculture, classic examples are application to monitor soil and crop properties [3] [4]. 
Similarly in the aerospace industry it is possible to find useful example of Wireless Sensors Networks in harsh environments, to monitor gas turbine engines [5]. In the nuclear industry there have been initiatives to deploy Commercial Off The Shelf (COTS) wireless instrumentation and control systems [6]. One such initiative resulted in Sellafield’s first application of this technology [7], with reported time saving of 16 weeks and a cost saving of £185k. However, there remain a number of significant challenges to address if Wireless Sensor Networks are to be deployed in nuclear decommissioning environments. One key challenge is the damaged to COTS integrated circuits caused by the high radiation levels and elevated temperatures. There are also fundamental communication challenges resulting from the very high signal attenuation experienced by Radio Frequency (RF) signals propagating through reinforced concrete wall and floor structures. In addition, many legacy buildings in nuclear facilities were not designed to be decommissioning, and limited access and unknown conditions are a further problem. In these situations the wireless sensing systems will need to be battery-powered, with the possibility of power harvesting. 
Wireless Sensor Network for Nuclear Decommissioning Industry A research project, sponsored by the Centre for Innovative Nuclear Decommissioning (CINDe","PeriodicalId":392429,"journal":{"name":"UK-RAS Conference: Robots Working For and Among Us Proceedings","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-12-04","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125229796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
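The abstract's point about severe RF attenuation can be made concrete with a rough link-budget sketch: free-space path loss (the Friis form) plus a per-wall penetration penalty. The 30 dB wall figure and the 2.4 GHz / 10 m operating point below are assumed placeholders for illustration, not values from the paper:

```python
import math

def free_space_path_loss_db(distance_m, freq_hz):
    """Free-space path loss in dB (Friis equation, isotropic antennas).

    FSPL(dB) = 20*log10(d) + 20*log10(f) + 20*log10(4*pi/c)
    with d in metres and f in Hz; the constant term is about -147.55 dB.
    """
    return (20 * math.log10(distance_m)
            + 20 * math.log10(freq_hz)
            - 147.55)

# illustrative link budget: 10 m at 2.4 GHz, plus an assumed 30 dB
# penetration loss for one reinforced-concrete wall
fspl = free_space_path_loss_db(10, 2.4e9)   # roughly 60 dB
total_loss_db = fspl + 30.0                 # roughly 90 dB
```

Even this crude estimate shows why a link that is comfortable in free space can fail after one or two reinforced walls, motivating the battery-powered, carefully placed sensor nodes the abstract describes.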