Latest publications from the 2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)

Foreground segmentation with efficient selection from ICP outliers in 3D scene
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7418962
H. Sahloul, J. Heredia, Shouhei Shirafuji, J. Ota
{"title":"Foreground segmentation with efficient selection from ICP outliers in 3D scene","authors":"H. Sahloul, J. Heredia, Shouhei Shirafuji, J. Ota","doi":"10.1109/ROBIO.2015.7418962","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7418962","url":null,"abstract":"Foreground segmentation enables dynamic reconstruction of the moving objects in static scenes. After KinectFusion had proposed a novel method that constructs the foreground from the Iterative Closest Point (ICP) outliers, numerous studies proposed filtration methods to reduce outlier noise. To this end, the relationship between outliers and the foreground is investigated, and a method to efficiently extract the foreground from outliers is proposed. The foreground is found to be directly connected to ICP distance outliers rather than the angle and distance outliers that have been used in past research. Quantitative results show that the proposed method outperforms prevalent foreground extraction methods, and attains an average increase of 11.8% in foreground quality. Moreover, real-time speed of 50 fps is achieved without heavy graph-based refinements, such as GrabCut. The proposed depth features surpass current 3D GrabCut, which only uses RGB-N.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133703306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Gait planning and control for biped robots based on modifiable key gait parameters from human motion analysis
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7418864
Hongbo Zhu, Minzhou Luo, T. Mei, Tao Li
{"title":"Gait planning and control for biped robots based on modifiable key gait parameters from human motion analysis","authors":"Hongbo Zhu, Minzhou Luo, T. Mei, Tao Li","doi":"10.1109/ROBIO.2015.7418864","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7418864","url":null,"abstract":"In this paper, effective gait planning and control is established for biped robots. First, an experiment of human locomotion is carried out using a motion capture system for analysis of human gait features. We found modifiable key gait parameters affecting the dominant performance of biped robots walking from extracted features. Then, we proposed an effective bio-inspired gait planning (BGP) algorithm using modifiable key gait parameters. In final, the proposed method has been verified through simulations and experiment in a real biped robot DRC-XT.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133952492","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Online and incremental contextual task learning and recognition for sharing autonomy to assist mobile robot teleoperation
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419076
Ming Gao, T. Schamm, Johann Marius Zöllner
{"title":"Online and incremental contextual task learning and recognition for sharing autonomy to assist mobile robot teleoperation","authors":"Ming Gao, T. Schamm, Johann Marius Zöllner","doi":"10.1109/ROBIO.2015.7419076","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419076","url":null,"abstract":"This contribution proposes a fast online approach to learn and recognize the contextual tasks incrementally, with the aim of assisting mobile robot teleoperation by efficiently facilitating autonomy sharing, which improves our previous approach, where a batch mode was adopted to obtain the model for task recognition. We employ a fast online Gaussian Mixture Regression (GMR) model combined with a recursive Bayesian filter (RBF) to infer the most probable contextual task the human operator executes across multiple candidate targets, which is capable of incorporating demonstrations incrementally. The overall system is evaluated with a set of tests in a cluttered indoor scenario and shows good performance.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132208620","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
A novel robot hand with the magneto-rheological fluid solidification
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419714
Qingyun Liu, Tiantian Jing, An Mo, Xiangrong Xu, Wenzeng Zhang
{"title":"A novel robot hand with the magneto-rheological fluid solidification","authors":"Qingyun Liu, Tiantian Jing, An Mo, Xiangrong Xu, Wenzeng Zhang","doi":"10.1109/ROBIO.2015.7419714","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419714","url":null,"abstract":"The conventional passively underactuated hand can self-adaptively grasp an object under the reaction force produced by other active joints or the grasped objects, but it may reject the object if the force disappears, and cannot grasp independently. In order to overcome this serious disadvantage, a novel kind design of the passively self-adaptive underactuated hand is proposed, called the magneto-rheological fluid (MRF) hand. The MRF can be instantaneously solidified while a fitful magnetic field being produced, and liquidized shortly after the magnetic field disappearing. Based on this characteristic, the MRF is applied to a self-adaptive hand which can solidify the shape of the joint grasping the object and keep the grasping force under the help of springs. The MRF Hand is actuated initially by the reaction force from the grasped object and locked by the solidified MRF ultimately. The MRF Hand can keep the shape of the grasping object securely during the grasping process.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134337871","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
Non-binding lower extremity exoskeleton (NextExo) for load-bearing
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419119
Du-Xin Liu, Xinyu Wu, Min Wang, Chunjie Chen, Ting Zhang, Ruiqing Fu
{"title":"Non-binding lower extremity exoskeleton (NextExo) for load-bearing","authors":"Du-Xin Liu, Xinyu Wu, Min Wang, Chunjie Chen, Ting Zhang, Ruiqing Fu","doi":"10.1109/ROBIO.2015.7419119","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419119","url":null,"abstract":"In this paper, we present a novel non-binding lower extremity exoskeleton (NextExo) for bearing load, where there is no binding point between the NextExo and human. With the innovative structure, the NextExo is able to stand in balance without attaching human, and bear the weights of its own and load completely. This also avoids the damage to operator caused by long-time binding. The NextExo has eight degrees of freedom, all of which are active joints powered by hydraulic actuators. It shadows human motion by one-to-one joints mapping. The man is as the core in the system to keep the NextExo in balance. Meanwhile, the constraint based on Zero Moment Point theory is adopted. The design concept, hardware structure, control scheme and preliminary experiments of NextExo are discussed.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"271 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134555484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 3
A robotic hand-arm teleoperation system using human arm/hand with a novel data glove
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419712
Bin Fang, Di Guo, F. Sun, Huaping Liu, Yupei Wu
{"title":"A robotic hand-arm teleoperation system using human arm/hand with a novel data glove","authors":"Bin Fang, Di Guo, F. Sun, Huaping Liu, Yupei Wu","doi":"10.1109/ROBIO.2015.7419712","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419712","url":null,"abstract":"Data glove is one of the most commonly used techniques in the robotic teleoperation systems. In this paper, we propose a robotic hand-arm teleoperation system with a novel data glove called YoBu, which can acquire human motions from both the arm and the hand simultaneously. The proposed data glove is designed to be stable, compact and portable. It is composed of eighteen low-cost inertial and magnetic measurement units, among which fifteen units are attached to the human operator's finger joints for robotic hand teleoperation and three units are attached to the palm, upper arm and forearm respectively for robotic arm teleoperation. In the robotic hand-arm teleoperation system, the operating commands generated by the data glove are transmitted to the robot via a Bluetooth wireless communication, which makes the whole robotic teleoperation system simple and user friendly. Finally, several experiments are implemented to verify the efficiency of the proposed robotic teleoperation system.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133376163","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 33
Analysis of underwater snake robot locomotion based on a control-oriented model
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419055
Anna M. Kohl, K. Pettersen, E. Kelasidi, J. Gravdahl
{"title":"Analysis of underwater snake robot locomotion based on a control-oriented model","authors":"Anna M. Kohl, K. Pettersen, E. Kelasidi, J. Gravdahl","doi":"10.1109/ROBIO.2015.7419055","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419055","url":null,"abstract":"This paper presents an analysis of planar underwater snake robot locomotion in the presence of ocean currents. The robot is assumed to be neutrally buoyant and move fully submerged with a planar sinusoidal gait and limited link angles. As a basis for the analysis, an existing, control-oriented model is further simplified and extended to general sinusoidal gaits. Averaging theory is then employed to derive the averaged velocity dynamics of the underwater snake robot from that model. It is proven that the averaged velocity converges exponentially to an equilibrium, and an analytical expression for calculating the forward velocity of the robot in steady state is derived. A simulation study that validates both the proposed modelling approach and the theoretical results is presented.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115572928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 10
On study of a wheel-track transformation robot
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419708
Jianbing Hu, Ansi Peng, Y. Ou, Guolai Jiang
{"title":"On study of a wheel-track transformation robot","authors":"Jianbing Hu, Ansi Peng, Y. Ou, Guolai Jiang","doi":"10.1109/ROBIO.2015.7419708","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419708","url":null,"abstract":"Wheeled robots usually move fast in relatively even surfaces while losing their major mobility on rough terrains. However, the tracked robots are superior to the wheel-types when they are to move on irregular terrains in spite of lower speed. In this paper, a Wheel-Track Transformation robot which combines the advantages of both mobile types by turning into wheel mode or track mode is designed. The robot consists of a robot platform equipped with an extra support mechanism, two wheels with transformation units and an universal wheel. Each transformation unit is composed of a rotating board and a double two-bar linkage mechanism. Three key mechanical components and several locomotion strategies on rough terrains are presented. Meanwhile, a detailed quasi-dynamic analysis is conducted to obtain proper torque capabilities of motors. Three prototypes of the proposed mechanism were manufactured and their fundamental mobility was evaluated by experiments.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114431887","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 6
Scene recognition based on extreme learning machine for digital video archive management
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419003
Dongsheng Cheng, Wenjing Yu, Xiaoling He, Shilong Ni, Junyu Lv, Weibo Zeng, Yuanlong Yu
{"title":"Scene recognition based on extreme learning machine for digital video archive management","authors":"Dongsheng Cheng, Wenjing Yu, Xiaoling He, Shilong Ni, Junyu Lv, Weibo Zeng, Yuanlong Yu","doi":"10.1109/ROBIO.2015.7419003","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419003","url":null,"abstract":"Video is a rich media widely used in many of our daily life applications like education, entertainment, surveillance, etc. In order to retrieve rapidly, it is necessary to establish digital archive for storing these videos. However, it is not realistic to store vast amounts of video data into digital archive artificially. This paper proposes a new method for the task of video digital archive management by employing scene recognition technology based on extreme learning machine (ELM). This paper only focuses on scene recognition technology which is the key step of digital video archive management. Dense scale invariant feature transform (dense SIFT) features are used as features in this proposed method. The 15-Scenes dataset with more than 4000 images is used. Experimental results have shown that this proposed method achieves not only high recognition accuracy but also extremely low computational cost.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114740897","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
3D moth-inspired chemical plume tracking
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO) Pub Date : 2015-12-01 DOI: 10.1109/ROBIO.2015.7419031
B. Gao, Hongbo Li, F. Sun
{"title":"3D moth-inspired chemical plume tracking","authors":"B. Gao, Hongbo Li, F. Sun","doi":"10.1109/ROBIO.2015.7419031","DOIUrl":"https://doi.org/10.1109/ROBIO.2015.7419031","url":null,"abstract":"This paper analyzes the conventional moth inspired chemical plume tracking, and presents a 3D moth-inspired CPT using multi-sensors. The aim of CPT is tracking a target chemical flow back to its source and declaring its location. While the moths acturally perform their CPT in 3D space, the majority of related works are based on wheels robots in 2D. Nowadays the rapid development of rotorcrafts makes the 3D plume tracking possible. Hence in this paper, we first present a detailed analysis of moth-inspired CPT work flow, then extend the orthodox moth-inspired CPT from 2D to 3D. Our simulation results demonstrate that our the multi-sensor CPT strategy proposed is a feasible and efficient method to locate odour source.","PeriodicalId":325536,"journal":{"name":"2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2015-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117077187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2