2018 11th International Workshop on Human Friendly Robotics (HFR): Latest Publications

Teleoperation Snake Robots with Panorama Vision
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633487
Long Chen, Xu Yuwen, Zhenshan Bing, Kai Huang, Jun Wang
Abstract: Snake robots could potentially be used for exploring constricted environments, owing to particular strengths such as accessing narrow cavities and climbing pole-like objects. Because external scenes are complex, snake robots cannot cope with every situation independently. We therefore propose a teleoperated snake robot with both a teleoperation mode and an autonomy mode based on panorama vision. In the teleoperation mode, the operator interacts with the snake robot through a VR headset and a teleoperation device. Once the target is detected, the snake robot can be switched to the autonomy mode and accomplish the assigned task automatically. To meet the needs of both modes, we developed a wireless panorama camera with four lenses, an inertial measurement unit (IMU), a GPS receiver, a WiFi module, and an FPGA board for the teleoperated snake robot. Based on this camera, we present a complete pole-climbing solution, including Single Shot multibox Detector (SSD)-based pole detection, panorama SLAM for pole localization, and a locomotion strategy for the snake robot to approach and climb poles. Experimental results demonstrate the practicability of the panorama camera and the effectiveness of the pole-climbing solution.
Citations: 1
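
The abstract above describes switching the snake robot from teleoperation to autonomy once the pole target is detected. The paper's actual switching logic is not given here, so the following Python sketch only illustrates the idea with hypothetical names (`Mode`, `update_mode`) and an assumed detection-confidence threshold:

```python
from enum import Enum, auto

class Mode(Enum):
    TELEOP = auto()
    AUTONOMY = auto()

CONFIDENCE_THRESHOLD = 0.8  # assumed value, not taken from the paper

def update_mode(mode, detection_confidence, operator_override=False):
    """Switch to autonomy once a pole is detected with high confidence;
    an explicit operator override always returns control to teleoperation."""
    if operator_override:
        return Mode.TELEOP
    if mode is Mode.TELEOP and detection_confidence >= CONFIDENCE_THRESHOLD:
        return Mode.AUTONOMY
    return mode

# Example: an SSD-style detector reports 0.9 confidence for a pole.
print(update_mode(Mode.TELEOP, 0.9))                            # Mode.AUTONOMY
print(update_mode(Mode.AUTONOMY, 0.2, operator_override=True))  # Mode.TELEOP
```
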
Survey on Intelligent Leather Nesting and Cutting Machine Technology
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633489
Xuexi Zhang, Guokun Lu, Shuting Cai, Zefeng Hu, Xiaoming Xiong, Qinhui Zhao
Abstract: The intelligent leather cutting machine is a key piece of leather manufacturing equipment. It detects and extracts features of the leather fabric, automatically optimizes part nesting according to the requirements of the parts, and completes production with CNC (Computerized Numerical Control) cutting, improving leather utilization and production efficiency while reducing labor cost. Building on the development of intelligent leather nesting and cutting technology, this paper introduces the composition of the leather cutting machine system and summarizes and analyzes intelligent leather nesting algorithms. Finally, future trends in intelligent leather nesting and cutting technology are discussed.
Citations: 2
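
The survey covers nesting algorithms that arrange parts on a leather sheet to maximize material utilization. As a much-simplified illustration only (a greedy shelf-packing baseline for rectangular parts, not any specific method from the survey), such a placement step might look like:

```python
def shelf_pack(parts, sheet_width):
    """Greedy shelf packing: sort rectangles by height and fill rows left to
    right. parts: list of (w, h); returns list of (x, y, w, h) placements.
    Assumes every part is narrower than the sheet."""
    placements = []
    x = y = shelf_h = 0.0
    for w, h in sorted(parts, key=lambda p: p[1], reverse=True):
        if x + w > sheet_width:            # start a new shelf below the current one
            x, y = 0.0, y + shelf_h
            shelf_h = 0.0
        placements.append((x, y, w, h))
        x += w
        shelf_h = max(shelf_h, h)
    return placements

print(shelf_pack([(4, 2), (3, 3), (2, 1), (5, 2)], sheet_width=8))
```
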
Teleoperation Master-slave Robot Based on Binocular Vision
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633522
Xiaoming Mai, Xu Yuwen, Yang Wang, Long Chen
Abstract: Teleoperated master-slave robots can be used for a variety of dangerous tasks, such as live-line work in power systems. Because the operator usually stays away from the scene, it is important to convey on-site information. Vision, the most important way for humans to access outside information, is also the first choice for remote operators to perceive the surroundings of the slave robot. To help the operator sense the live scene better, we developed a vision system combining fisheye binocular vision and ordinary binocular vision, which lets the operator perceive the wide-scale environment around the robot through the fisheye camera and the details of objects through the ordinary binocular camera. To address the large distortion of the fisheye camera, we focus on correcting and interpolating the fisheye image and propose a cosine similar interpolation algorithm that better recovers missing pixels of the fisheye image and enables depth estimation. Combined with our vision device and the master-slave robot arm, we also designed a teleoperation device with improved human-computer interaction.
Citations: 0
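
The abstract mentions a "cosine similar interpolation" for recovering missing pixels after fisheye correction, but does not spell out the formula; the sketch below is therefore only a generic cosine-weighted interpolation between known neighboring pixels, offered as an assumption about what such a step could look like:

```python
import numpy as np

def cosine_interpolate(left, right, t):
    """Cosine-weighted blend between two known pixel values.
    t in [0, 1]: 0 returns `left`, 1 returns `right`; the cosine weight
    gives a smoother transition than linear interpolation."""
    w = (1.0 - np.cos(np.pi * t)) / 2.0
    return (1.0 - w) * left + w * right

def fill_row_gaps(row):
    """Fill NaN gaps in a 1-D row of pixel intensities using the nearest
    known neighbors on each side (simplified: interior gaps only)."""
    row = row.astype(float).copy()
    known = np.flatnonzero(~np.isnan(row))
    for a, b in zip(known[:-1], known[1:]):
        for i in range(a + 1, b):
            row[i] = cosine_interpolate(row[a], row[b], (i - a) / (b - a))
    return row

print(fill_row_gaps(np.array([10.0, np.nan, np.nan, 40.0, 50.0])))
```
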
Visual Perception System for Randomized Picking Task
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633526
Bo Zhan, Xin Wang, Jingyuan Wu, Shuaishuai Wang, Aizhen Li
Abstract: Randomized picking is a classical, practically valuable, but complex task for collaborative robots. In this paper we present an efficient and robust perception system for this task, capable of detecting and classifying each instance in cluttered, occluded scenes and outputting the 6D pose of the target object for grasping. For runtime efficiency, we design a grasping strategy that automatically selects an appropriate target among multiple candidates, handles objects whose point clouds contain too few points, and corrects registration results that violate common sense. A gripper opening estimation algorithm is also presented to prevent the fingers from colliding with objects neighboring the target. Finally, to test the effectiveness and robustness of the proposed approaches, we report experimental results for the whole robot system.
Citations: 0
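
The abstract presents a gripper opening estimation that keeps the fingers clear of the target's neighbors. The sketch below is not the paper's algorithm; it is a minimal illustration, with hypothetical parameters (`margin`, `neighbor_gaps`), of clamping the opening between the object width and the nearest-neighbor clearance:

```python
def estimate_opening(target_width, neighbor_gaps, max_opening, margin=0.01):
    """Pick a gripper opening (metres) wide enough for the target plus a
    safety margin, but narrow enough that the fingers stay clear of the
    closest neighboring object. Returns None if no feasible opening exists."""
    desired = target_width + 2 * margin                      # comfortable opening
    limit = min(neighbor_gaps) - margin if neighbor_gaps else max_opening
    opening = min(desired, limit, max_opening)
    if opening < target_width:                               # cannot even fit the object
        return None
    return opening

# Target is 4 cm wide; the nearest neighbor leaves 6 cm of free space.
print(estimate_opening(0.04, neighbor_gaps=[0.06, 0.09], max_opening=0.10))
```
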
Research on Dynamic Stability Prediction of Large Heavy Six-legged Robots On Slope
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633528
F. Zha, Qiming Wang, Penglong Zheng, Chen Chen, Wei Guo
Abstract: Determining stability is critical for heavy six-legged robots, since such robots are difficult to recover once they become unstable. This paper describes a method for predicting the robot's stability on slopes based on its dynamic stability and terrain information. To use topographic information conveniently and rationally when predicting instability such as slipping and tipping, fuzzy inference over the terrain inputs is used to predict the robot's stability. Finally, the algorithm is verified in a simulation environment.
Citations: 0
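
The abstract predicts stability with fuzzy inference over terrain information, but the rule base is not reproduced here. The toy sketch below only shows the general shape of a Mamdani-style inference with triangular membership functions over two assumed inputs (slope angle and ground friction):

```python
def tri(x, a, b, c):
    """Triangular membership function with peak at b and support [a, c]."""
    return max(0.0, min((x - a) / (b - a), (c - x) / (c - b)))

def stability_risk(slope_deg, friction):
    """Toy two-rule inference: output 0 means stable, 1 means unstable.
    Rule 1: slope is steep AND ground is slippery -> unstable.
    Rule 2: otherwise -> stable. Defuzzified by a weighted average."""
    steep = tri(slope_deg, 10, 30, 50)        # membership of "slope is steep"
    slippery = tri(friction, 0.0, 0.2, 0.6)   # membership of "ground is slippery"
    w_unstable = min(steep, slippery)
    w_stable = 1.0 - w_unstable
    return w_unstable * 1.0 + w_stable * 0.0

print(stability_risk(slope_deg=25, friction=0.25))   # 0.75, leaning unstable
```
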
Laser Point Detection Based on Improved Target Matching Method for Application in Home Environment Human-Robot Interaction
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633527
Yaxin Liu, Yanqiang Zhang, Yufeng Yao, Ming Zhong
Abstract: As population aging becomes more severe, the development of Wheelchair Mounted Robotic Arms (WMRA) has gained greater attention. Issues such as human-robot interaction and real-time performance remain unsolved. In this paper, a laser pointer is used to facilitate interaction between the human and the WMRA, and an improved target matching method is proposed for laser point detection in home environments. First, the laser point's characteristics are amplified through channel separation and reflective materials. Then, the laser point is segmented from the image using a background difference method. Finally, using an ASUS Xtion, the distance between the laser point and the centroid of the object is calculated, and a Kinova Jaco robotic arm is used for grasping. Experimental results show that the algorithm can effectively detect the laser point in home environments, and, as a human-robot interaction demonstration, the robotic arm successfully completes a grasping task.
Citations: 3
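
The abstract combines channel separation and a background-difference step to isolate the laser point. A minimal OpenCV sketch of that combination is given below; the synthetic frames and thresholds are assumptions, not the paper's values:

```python
import numpy as np
import cv2

def detect_laser_point(background, frame, diff_thresh=40, red_thresh=180):
    """Return the (x, y) centroid of a bright red blob that appears in `frame`
    but not in `background`, or None if nothing passes both tests."""
    diff = cv2.absdiff(frame, background)                  # background difference
    moving = cv2.cvtColor(diff, cv2.COLOR_BGR2GRAY) > diff_thresh
    red_dominance = frame[:, :, 2].astype(int) - frame[:, :, 1]
    mask = moving & (red_dominance > 0) & (frame[:, :, 2] > red_thresh)
    if not mask.any():
        return None
    ys, xs = np.nonzero(mask)
    return int(xs.mean()), int(ys.mean())                  # centroid of the blob

# Synthetic test: a dark scene with a bright red dot added near (30, 20).
bg = np.full((60, 80, 3), 50, np.uint8)
fr = bg.copy()
fr[18:22, 28:32] = (60, 60, 255)                           # BGR: strong red spot
print(detect_laser_point(bg, fr))                          # approximately (29, 19)
```
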
Design of Control System for Educational Robot with Six-Degree Freedom
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633500
Gao Hewei, Li Xingdong, Wang Yangwei
Abstract: To provide a friendlier human-machine interface, more precise control of a six-degree-of-freedom educational robot is developed. The manipulator is driven by a PLC programmed in TIA Portal V14, which communicates with an upper computer. Communication uses the TCP/IP protocol, and the upper computer is developed with Socket programming in a C++ MFC environment. Kinematic analysis of the manipulator is carried out with the D-H parameter method and embedded into the upper computer program [9]. Finally, absolute positioning control of the manipulator is realized. Through the visual human-computer interface, the motion process and control principles of the six-degree-of-freedom robot can be understood in greater depth.
Citations: 2
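
The abstract performs kinematic analysis with the D-H parameter method. The classic D-H link transform itself is standard, so the following numpy sketch shows forward kinematics for an assumed two-link parameter table (not the educational robot's actual parameters):

```python
import numpy as np

def dh_transform(theta, d, a, alpha):
    """Classic Denavit-Hartenberg homogeneous transform for one link."""
    ct, st = np.cos(theta), np.sin(theta)
    ca, sa = np.cos(alpha), np.sin(alpha)
    return np.array([
        [ct, -st * ca,  st * sa, a * ct],
        [st,  ct * ca, -ct * sa, a * st],
        [0.0,      sa,       ca,      d],
        [0.0,     0.0,      0.0,    1.0],
    ])

def forward_kinematics(joint_angles, dh_table):
    """Chain the link transforms; dh_table rows are (d, a, alpha)."""
    T = np.eye(4)
    for theta, (d, a, alpha) in zip(joint_angles, dh_table):
        T = T @ dh_transform(theta, d, a, alpha)
    return T

# Hypothetical 2-link planar arm: both links 0.3 m, no offsets or twists.
table = [(0.0, 0.3, 0.0), (0.0, 0.3, 0.0)]
print(forward_kinematics([np.pi / 4, np.pi / 4], table)[:3, 3])
```
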
People Following System Based on LRF
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633507
Zuoquan Zhao, Chenglei Fang, Qinyuan Ren
Abstract: A key capability for differential-drive mobile robots is following a target person automatically with low computational complexity. In this paper, we propose a simple, effective, and robust target person following system based on a Laser Range Finder (LRF). First, the raw LRF data are clustered using a dynamic threshold based on measurement distance and the LRF's measurement error. Accurate position information is then obtained through an exclusion method combined with temporal information: based on the geometric characteristics of human legs, the exclusion method yields candidate clusters that may represent the target person, and the final target location is picked from these candidates using temporal information. Finally, several PID controllers drive the mobile robot so that it follows the target person steadily and robustly. The method is implemented in the Robot Operating System (ROS) on our own mobile robot, and experimental results verify its effectiveness, robustness, and accuracy.
Citations: 1
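
The abstract clusters raw LRF data with a dynamic threshold that depends on measurement distance. One plausible realization (the constants below are assumptions) splits the scan wherever the jump between consecutive ranges exceeds a threshold that grows with distance:

```python
import numpy as np

def cluster_scan(ranges, base_gap=0.05, gap_per_metre=0.02):
    """Split a 1-D array of LRF ranges (metres) into clusters of consecutive
    beams. A new cluster starts when the range jump between neighboring beams
    exceeds a dynamic threshold that grows with the measured distance."""
    clusters, current = [], [0]
    for i in range(1, len(ranges)):
        threshold = base_gap + gap_per_metre * ranges[i - 1]
        if abs(ranges[i] - ranges[i - 1]) > threshold:
            clusters.append(current)
            current = []
        current.append(i)
    clusters.append(current)
    return clusters

# Two legs at about 1 m, separated by a gap seen through to a 3 m wall.
scan = np.array([1.00, 1.01, 1.02, 3.00, 3.01, 1.03, 1.02, 1.01])
print(cluster_scan(scan))   # [[0, 1, 2], [3, 4], [5, 6, 7]]
```
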
The Research of Depth Perception Method Based on Sparse Random Grid
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633523
Dongxue Li, Fang Xu, Fengshan Zou, P. Di, Hongyu Wang
Abstract: In this paper, we propose a high-resolution depth sensing method based on structured light. In 3D contour scanning, passive binocular stereo vision struggles to obtain enough 3D information for objects with inconspicuous surface features. To solve this problem, a method for obtaining sparse depth based on a random grid is proposed, combining the binocular stereo vision principle with the structured-light projection of active vision. Four templates are projected onto the object's surface: one random grid and three templates with different phase offsets. The relative phase image is calculated according to the three-step phase-shifting scheme, and the depth map is then computed as in conventional structured-light methods. Experiments demonstrate that the proposed depth sensing is more accurate and of higher resolution than existing methods.
Citations: 0
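
The abstract computes the relative phase image with a three-step phase-shifting scheme. For the common 120-degree three-step variant the wrapped phase has a closed form, sketched below under the assumption (not stated in the abstract) that the shifts are -120°, 0°, and +120°:

```python
import numpy as np

def wrapped_phase(i1, i2, i3):
    """Relative (wrapped) phase from three fringe images with phase shifts of
    -120, 0, +120 degrees: I_k = A + B*cos(phi + delta_k). Returns phi in (-pi, pi]."""
    return np.arctan2(np.sqrt(3.0) * (i1 - i3), 2.0 * i2 - i1 - i3)

# Synthetic check: build the three images from a known phase and recover it.
phi_true = np.linspace(-np.pi + 0.1, np.pi - 0.1, 5)
A, B = 0.5, 0.4
i1 = A + B * np.cos(phi_true - 2 * np.pi / 3)
i2 = A + B * np.cos(phi_true)
i3 = A + B * np.cos(phi_true + 2 * np.pi / 3)
print(np.allclose(wrapped_phase(i1, i2, i3), phi_true))   # True
```
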
Mechatronic Design and Control of a 3D Printed Low Cost Robotic Upper Limb
Pub Date: 2018-11-01 | DOI: 10.1109/HFR.2018.8633519
Duncan Carter-Davies, Junshen Chen, Fei Chen, Miao Li, Chenguang Yang
Abstract: Robots and robotic technologies are changing how society functions. Found throughout industry, education, and more recently consumers' homes, robotic devices perform and assist with a wide variety of tasks. Robotic limbs allow users to regain or enhance their abilities, notably when completing challenging tasks, leading to greater productivity with less strain on the user. However, robotic limbs tend to be very costly, which limits their availability. In this paper, we develop an inexpensive 3D printed robotic upper limb with four degrees of freedom. An Arduino microcontroller, electromechanical actuators, and additional electronics were integrated into the 3D printed chassis, forming a functional self-contained system. The assembled robotic upper limb, with potential use as a supernumerary robotic limb or transhumeral prosthesis for both disabled and non-disabled users, was then tested for appropriate function. Discussion of possible design improvements leads to the conclusion that 3D printing and inexpensive components hold clear potential for affordable robotic limbs, with the prototype serving as a successful foundation.
Citations: 3