2018 15th International Conference on Ubiquitous Robots (UR): Latest Publications

Linkage-Type Walking Mechanism for Unstructured Vertical Wall
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441823
Yanheng Liu, Taewon Seo
Abstract: This paper presents a new wall-climbing robot design equipped with a linkage-type walking mechanism for navigating unstructured vertical surfaces. Because an unstructured vertical surface limits a robot's wall-climbing performance, it is a crucial factor motivating research on vertical wall-climbing robots. To address this problem, a wall-climbing robot based on a four-bar mechanism, performing a repeated walking gait with a single actuator, was studied. Its symmetric linkage design adapts to uneven surfaces and overcomes obstacles with compliant motion. For attachment to the wall surface, the mechanism employs flat dry adhesion. Essential design parameters were determined from kinematic and static analyses and used in the robot's configuration. Experiments confirmed that the robot can climb over a 10-mm step, which is 100% of its stride height and 20% of its body height. With this ability, the robot can access wider areas than commonly designed wall-climbing machines can cover.
Citations: 2
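As a quick consistency check (a sketch, not from the paper), the two reported ratios pin down the dimensions they imply:

```python
# The abstract states a 10-mm step equals 100% of the stride height and
# 20% of the robot's height; solving both ratios gives the implied sizes.
step_mm = 10.0
stride_height_mm = step_mm / 1.00   # step is 100% of stride height
robot_height_mm = step_mm / 0.20    # step is 20% of robot height
print(stride_height_mm, robot_height_mm)  # 10.0 50.0
```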
Robotic Companions for Long Term Isolation Space Missions
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441838
S. Engler, J. Hunter, K. Binsted, H. Leung
Abstract: We tested robotic companions for their ability to reduce stress and create emotional bonds during a Mars-analog space mission. Two different robotic companion platforms were programmed with aggressive and passive personalities and given to crew members for a three-day evaluation. Surveys and feedback from crew members were used to evaluate the robots' effectiveness at reducing stress, and personality traits were examined in terms of the robots' ability to create positive interactions with the crew. Based on this information, a new behavioral model is proposed that provides sufficient complexity and adaptability for a robot companion to interact successfully with humans, create emotional bonds, and mitigate distress in crew members.
Citations: 1
Shared Teleoperation for Nuclear Plant Robotics Using Interactive Virtual Guidance Generation and Shared Autonomy Approaches
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441814
Vitalii Pruks, Kwang-Hyun Lee, J. Ryu
Abstract: As telerobot systems take on complex tasks in cluttered environments such as disaster sites, underwater, and space, the need for shared teleoperation grows. However, designing autonomy that accounts for environmental uncertainty is challenging. This paper proposes a shared teleoperation framework and an interactive virtual guidance generation method, both of which exploit the cognitive abilities of the human operator. In shared teleoperation, the operator's intention is provided to the autonomous system as a path acquired through our proposed sketch method: the operator sketches the path on the visual feedback image of the remote environment, and the path is then transmitted to the autonomous system. In virtual guidance generation, human cognitive abilities assist the computer in generating virtual guidance for teleoperation tasks. Both resulting systems are more efficient and less fatiguing than direct teleoperation.
Citations: 7
Research on a Precise Grinding Device for Servo Valve Spool
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441901
G. Wang, Yu-xin Zhang, Xudong Pan
Abstract: The grinding of the valve spool and valve sleeve plays a key part in manufacturing electro-hydraulic servo valves, and the precision of lap matching directly determines the accuracy of servo valve grinding. The objective of this paper is to develop a sub-micron precision grinding device for the servo valve spool throttling edge that addresses tool-setting error, high interference, and low efficiency, based on acoustic emission theory and micro-feed technology. An overall design is proposed in which acoustic emission sensing and a motor-driven handwheel set the tool automatically, while piezoelectric ceramics provide micro feeding. Test results show that the grinding meets the design requirements.
Citations: 2
User Interface Design for Semi-Autonomous Teleoperation of Manipulator-Stage System on Flexible Beam
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441770
Hackchan Kim, ChangSu Ha, Joonmo Ahn, Jeongseob Lee, Dongjun Lee
Abstract: A novel user interface design is presented for a semi-autonomous teleoperation system consisting of a 3-DOF manipulator, a 1-DOF linear stage, and a vertically erected flexible beam. An LQR (linear quadratic regulator) controller designed to suppress beam vibration runs autonomously on the slave-robot system, while the 3-DOF manipulator performs peg-in-hole tasks commanded by the master from a remote location through inverse kinematics-based workspace tracking control. The main components of our interface are 3D vision transmitted to the master via an HMD (head-mounted display) and haptic feedback computed from contact information. From the perspective of human perception, the 3D camera placement is optimized to maximize work efficiency. To verify the proposed interface, human subject tests were performed with eight participants, comparing our interface against a 2D webcam interface and a non-haptic-feedback interface. Furthermore, to allow the robotic system to run in any environment, an on-board implementation using three strain gauges, one IMU (inertial measurement unit), and the actuator encoders is presented.
Citations: 0
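The abstract's actual beam dynamics and weights are not given, but the LQR step it describes can be sketched on a hypothetical single-mode beam model (state = tip deflection and velocity; all numbers below are invented for illustration):

```python
import numpy as np
from scipy.linalg import solve_continuous_are

# Assumed single vibration mode: lightly damped second-order system.
wn, zeta = 2 * np.pi * 1.5, 0.01      # assumed natural frequency, damping ratio
A = np.array([[0.0, 1.0],
              [-wn**2, -2 * zeta * wn]])
B = np.array([[0.0], [1.0]])          # stage acceleration as the control input

Q = np.diag([100.0, 1.0])             # penalize deflection more than velocity
R = np.array([[0.1]])

P = solve_continuous_are(A, B, Q, R)  # algebraic Riccati equation
K = np.linalg.solve(R, B.T @ P)       # optimal state feedback u = -K x

eig_cl = np.linalg.eigvals(A - B @ K)
print(K, eig_cl.real)                 # closed loop: all real parts negative
```

The gain K would then run autonomously on the slave side, independent of the operator's peg-in-hole commands.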
Optical Coherence Tomography Image Segmentation for Cornea Surgery using Deep Neural Networks
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441889
Young Jin Heo, Ikjong Park, K. H. Kim, Myoung-Joon Kim, W. Chung
Abstract: This paper describes the use of deep neural networks for semantic segmentation of optical coherence tomography (OCT) images, accurately predicting segmentation masks from noisy and occluded OCT images. The OCT images and semantic masks were acquired, together with commercial surgical tools, from an ex-vivo porcine eye. Simple post-processing computes the needle tip position and insertion depth from the predicted semantic masks. Segmentation accuracy, needle tip position error, and insertion depth error were compared across FCN-8s, dilated convolution, and U-Net models. U-Net achieved the highest accuracy in the presence of occlusion and object overlap (81.5% mean IoU; 30.0-µm tip-position error). The results show that OCT image segmentation can be applied to the development of a surgical robot for corneal suturing.
Citations: 0
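The mean IoU metric behind the 81.5% figure is standard for segmentation masks; a minimal sketch (class labels below are hypothetical, not the paper's):

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """Mean intersection-over-union across classes for label masks."""
    ious = []
    for c in range(num_classes):
        p, t = pred == c, target == c
        inter = np.logical_and(p, t).sum()
        union = np.logical_or(p, t).sum()
        if union > 0:                 # skip classes absent from both masks
            ious.append(inter / union)
    return float(np.mean(ious))

# Toy 2x4 masks with assumed classes {0: background, 1: needle, 2: cornea}
pred   = np.array([[0, 1, 1, 2], [0, 0, 2, 2]])
target = np.array([[0, 1, 2, 2], [0, 0, 2, 2]])
print(mean_iou(pred, target, 3))  # 0.75
```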
Development of Automatic Virtual Guidance Visualization System for Teleoperation Using Object Detection
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441808
Kyunghwan Cho, K. Ko, Heereen Shim, Inhoon Jang
Abstract: This paper proposes automatically generating virtual guidance for a teleoperation system using object detection. Virtual guidance that gives force feedback through a haptic device is very helpful to an operator performing a mission such as peg-in-hole with a slave robot. The key point of our study is to generate this virtual guidance automatically using a deep learning architecture, so the operator no longer needs to create guidance one item at a time. In addition, our system can generate virtual guidance automatically and continuously in moving situations as well as stationary ones. The position of the target object is recognized using the deep learning architecture together with depth information, and the virtual guidance is visualized using the virtual environment visualization system, helping the operator perceive spatial information.
Citations: 1
Designing an Autonomous Vehicle Lean Recovery System for Motorcycles
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8442205
Alberto Pérez, G. Hernandez, Naomi Folta, R. Regalado, Shawn McElwain, D. Lofaro
Abstract: Motorcycle accidents can cause property damage, bodily injury, and even death. The purpose of the vehicle lean recovery system is to introduce a level of safety and security to motorcycle riding not previously available. Three preliminary designs capable of preventing a motorcycle from rolling over were developed and tested. The designs were developed through mathematical modeling, 3D simulation and CAD modeling, cost analysis, preliminary calculation of required forces, power requirements, component weights, and quantitative estimation of feasible range and performance. These quantitative metrics were compared across the designs and used to create a weighted decision matrix. Finally, a prototype was manufactured and tested.
Citations: 0
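The weighted decision matrix used to rank the three designs can be sketched as follows; the criteria, weights, and scores are invented for illustration (the paper's are not given in the abstract):

```python
import numpy as np

# Hypothetical weighted decision matrix: rows are candidate designs,
# columns are scoring criteria on a 1-5 scale.
criteria = ["required force", "power", "weight", "cost"]
weights = np.array([0.4, 0.2, 0.2, 0.2])   # must sum to 1
scores = np.array([
    [4, 3, 2, 3],   # design A
    [3, 4, 4, 2],   # design B
    [5, 2, 3, 4],   # design C
])

totals = scores @ weights                  # weighted sum per design
best = int(np.argmax(totals))
print(totals, f"design {'ABC'[best]} ranks highest")
```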
Research on Fault Diagnosis of Rotational Automaton Based on VMD-ELM
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441863
Pan Mingzhi, Pan Hong-xia, Xu Xin, Liu Huiling
Abstract: Because an automaton operates in a complex environment where various response signals are superimposed, accurately and efficiently extracting fault characteristics from its signals is difficult. This paper proposes a fault analysis method for rotational automatons combining variational mode decomposition (VMD) with an extreme learning machine (ELM). The automaton signal was first decomposed with VMD and compared with the result of empirical mode decomposition (EMD). The energy percentage of each modal component and the sample entropy of samples under various operating conditions were then extracted as feature values and fed into the ELM for fault diagnosis, and the diagnostic result was compared with traditional bispectrum analysis. VMD adaptively partitioned each signal component in the frequency domain, and the ELM achieved a fault diagnosis accuracy of 87.5%. The experimental results show that VMD effectively avoids mode aliasing and demonstrate the feasibility and effectiveness of the proposed method.
Citations: 0
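One of the two features the abstract names, the energy percentage of each modal component, is straightforward to compute once a decomposition (VMD or EMD) has produced the modes; a minimal sketch with synthetic modes standing in for real VMD output:

```python
import numpy as np

def energy_percentages(modes):
    """Energy share of each decomposed mode, used as a fault feature.
    `modes` is an (n_modes, n_samples) array of VMD/EMD components."""
    energies = np.sum(np.asarray(modes, dtype=float) ** 2, axis=1)
    return energies / energies.sum()

# Toy example: two sinusoidal "modes" sampled at 1 kHz for 1 s.
t = np.arange(0, 1, 1e-3)
modes = np.vstack([np.sin(2 * np.pi * 50 * t),          # amplitude 1
                   0.5 * np.sin(2 * np.pi * 120 * t)])  # amplitude 0.5
print(energy_percentages(modes))  # [0.8 0.2]
```

The resulting feature vector (together with sample entropy) would form one input row for the ELM classifier.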
Fuzzy Impedance Control for Enhancing Capabilities of Humans in Onerous Tasks Execution
Pub Date: 2018-06-01 | DOI: 10.1109/URAI.2018.8441800
L. Roveda, Shaghavezh Haghshenas, A. Prini, T. Dinon, N. Pedrocchi, F. Braghin, L. M. Tosatti
Abstract: Human-robot cooperation is increasingly demanded in industrial applications. Many tasks require the robot to enhance human capabilities, allowing workers to execute onerous tasks or improving their functionality. Besides wearable robotics, standard industrial manipulators are a common solution for empowering humans. With this aim, the paper describes a fuzzy impedance control approach for assisting a human operator in onerous industrial applications. The developed method sets the set-point of the impedance controller in real time based on human intentions. Two fuzzy membership functions, defined on the interaction-force derivative and the robot velocity signals, respectively, are used to calculate the assistance level needed by the operator. The effectiveness of the proposed approach is verified on a KUKA iiwa 14 R820 in an experimental procedure where human subjects evaluate the robot's performance in a collaborative lifting task.
Citations: 24
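The two-membership structure the abstract describes can be sketched as below; the triangular shape, breakpoints, and averaging rule are assumptions for illustration, since the paper's actual membership functions are not given here:

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular fuzzy membership function peaking at b."""
    return float(np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0))

def assistance_level(force_derivative, velocity):
    """Blend two memberships into one assistance level in [0, 1].
    Breakpoints (20 N/s, 0.25 m/s peaks) are invented for illustration."""
    mu_f = tri(abs(force_derivative), 0.0, 20.0, 40.0)   # N/s
    mu_v = tri(abs(velocity), 0.0, 0.25, 0.5)            # m/s
    return (mu_f + mu_v) / 2.0                           # simple average

print(assistance_level(20.0, 0.25))  # both memberships peak -> 1.0
```

In the paper's scheme, this level would then shift the impedance controller's set-point in real time to offload the operator.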