2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI): Latest Publications

Hand gesture recognition for Human-Robot Interaction for service robot
R. Luo, Yen-Chang Wu
{"title":"Hand gesture recognition for Human-Robot Interaction for service robot","authors":"R. Luo, Yen-Chang Wu","doi":"10.1109/MFI.2012.6343059","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343059","url":null,"abstract":"With advances in technology, robots play an important role in our lives. Nowadays, we have more chance to see robots service in our society such as intelligent robot for rescue and for service. Therefore, Human-Robot interaction becomes an essential issue for research. In this paper we introduce a combining method for hand sign recognition. Hand sign recognition is an essential way for Human-Robot Interaction (HRI). Sign language is the most intuitive and direct way to communication for impaired or disabled people. Through the hand or body gestures, the disabled can easily let caregiver or robot know what message they want to convey. In this paper, we propose a combining hands gesture recognition algorithm which combines two distinct recognizers. These two recognizers collectively determine the hand's sign via a process called CAR equation. These two recognizers are aimed to complement the ability of discrimination. To achieve this goal, one recognizer recognizes hand gesture by hand skeleton recognizer (HSR), and the other recognizer is based on support vector machines (SVM). In addition, the corresponding classifiers of SVM are trained using different features like local binary pattern (LBP) and raw data. Furthermore, the trained images are using Bosphorus Hand Database and in addition to taking by us. A set of rules including recognizer switching and combinatorial approach recognizer CAR equation is devised to synthesize the distinctive methods. We have successfully demonstrated gesture recognition experimentally with successful proof of concept.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"99 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127102631","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 26
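As an illustrative companion to this entry, and not the authors' implementation, the sketch below combines an LBP-feature SVM with a second recognizer's score for hand-gesture classification; the feature extraction, class labels, and equal weighting are assumptions.

```python
# Illustrative sketch only: an LBP-feature SVM combined with a second
# recognizer's score, loosely inspired by the abstract above. The dataset,
# labels and the 0.5/0.5 weighting are assumptions, not the paper's CAR rule.
import numpy as np
from skimage.feature import local_binary_pattern
from sklearn.svm import SVC

def lbp_histogram(gray_image, points=8, radius=1):
    """Uniform-LBP histogram used as a simple texture descriptor."""
    lbp = local_binary_pattern(gray_image, points, radius, method="uniform")
    hist, _ = np.histogram(lbp, bins=points + 2, range=(0, points + 2), density=True)
    return hist

def combine_scores(svm_probs, skeleton_probs, w=0.5):
    """Naive stand-in for the paper's CAR combination: a weighted average."""
    return w * svm_probs + (1.0 - w) * skeleton_probs

# Hypothetical training data: grayscale hand images and gesture labels.
train_images = [np.random.rand(64, 64) for _ in range(20)]
train_labels = np.random.randint(0, 4, size=20)

features = np.array([lbp_histogram(img) for img in train_images])
svm = SVC(probability=True).fit(features, train_labels)

test_feat = lbp_histogram(np.random.rand(64, 64)).reshape(1, -1)
svm_probs = svm.predict_proba(test_feat)[0]
skeleton_probs = np.full(svm_probs.shape, 1.0 / len(svm_probs))  # placeholder second recognizer
print("combined decision:", np.argmax(combine_scores(svm_probs, skeleton_probs)))
```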
Estimating the posture of pipeline inspection robot with a 2D Laser Range Finder
Yuanyuan Hu, Zhangjun Song, Jun‐Hua Zhu
{"title":"Estimating the posture of pipeline inspection robot with a 2D Laser Rang Finder","authors":"Yuanyuan Hu, Zhangjun Song, Jun‐Hua Zhu","doi":"10.1109/MFI.2012.6342999","DOIUrl":"https://doi.org/10.1109/MFI.2012.6342999","url":null,"abstract":"Pipeline network is one of the city's critical infrastructures, such as lots of gas pipes and water pipes exist in public utilities, factories and so on. Regular inspection is required to ensure the static integrity of the pipes and to insure against the problems associated with failure of the pipes. We have developed a pipeline inspection robot equipped with a camera which can walk in the pipes and stream back live video to the base station. In this paper we propose a new method for estimating the posture of the robot in round pipes with a 2D Laser Rang Finder (LRF) and a dual tilt-sensor by using the geometrical characteristic of the round pipes constructed with the point cloud data. Transformation matrix from the robot coordinate system to the global system is deduced. The positions and sizes of pipe defects can be calculated easily relying on the range data and images. Experiments by the inspection robot in dry smooth HDPE pipes are carried out and the results show that the proposed method is useful and valid.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123696977","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 7
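The abstract above relies on the geometric characteristics of round pipes seen by a 2D laser scanner. As a hedged illustration (not the paper's posture-estimation method), the snippet below fits a circle to 2D range points with the algebraic least-squares (Kasa) method, one common way to recover a pipe cross-section's center and radius.

```python
# Minimal sketch: algebraic (Kasa) least-squares circle fit to 2D laser points.
# This is a generic technique for recovering a round pipe's cross-section,
# not the paper's posture-estimation pipeline.
import numpy as np

def fit_circle(points):
    """Fit x^2 + y^2 + D*x + E*y + F = 0 in a least-squares sense.

    Returns (center_x, center_y, radius).
    """
    x, y = points[:, 0], points[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = -(x**2 + y**2)
    (D, E, F), *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = -D / 2.0, -E / 2.0
    radius = np.sqrt(cx**2 + cy**2 - F)
    return cx, cy, radius

# Synthetic scan of a pipe wall (radius 0.2 m) with a little noise.
angles = np.linspace(0, 2 * np.pi, 90, endpoint=False)
scan = np.column_stack([0.05 + 0.2 * np.cos(angles),
                        -0.02 + 0.2 * np.sin(angles)])
scan += np.random.normal(scale=0.002, size=scan.shape)

print("center and radius:", fit_circle(scan))
```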
Estimation analysis in VSLAM for UAV application
Xiaodong Li, N. Aouf, A. Nemra
{"title":"Estimation analysis in VSLAM for UAV application","authors":"Xiaodong Li, N. Aouf, A. Nemra","doi":"10.1109/MFI.2012.6343039","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343039","url":null,"abstract":"This paper presents an in-depth evaluation of filter algorithms utilized in the estimation of 3D position and attitude for UAV using stereo vision based Visual SLAM integrated with feature detection and matching techniques i.e., SIFT and SURF. The evaluation's aim was to investigate the accuracy and robustness of the filters' estimation for vision based navigation problems. The investigation covered several filter methods and both feature extraction algorithms behave in VSLAM applied to UAV. Statistical analyses were carried out in terms of error rates. The Robustness and relative merits of the approaches are discussed to conclude along with evidence of the filters' performances.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121544329","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 7
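The evaluation above compares filter estimates against ground truth in terms of error statistics. A minimal, generic sketch of such statistics (assumed here, not taken from the paper) computes per-pose error, RMSE, and maximum error between an estimated and a reference trajectory.

```python
# Generic trajectory-error statistics of the kind used when comparing
# SLAM filters against ground truth; the trajectories here are synthetic.
import numpy as np

def trajectory_errors(estimated, ground_truth):
    """Per-pose Euclidean errors plus RMSE and maximum error."""
    diffs = np.linalg.norm(estimated - ground_truth, axis=1)
    return diffs, float(np.sqrt(np.mean(diffs**2))), float(diffs.max())

# Synthetic 3D trajectories: ground truth plus a noisy estimate.
t = np.linspace(0, 10, 200)
ground_truth = np.column_stack([t, np.sin(t), 0.1 * t])
estimated = ground_truth + np.random.normal(scale=0.05, size=ground_truth.shape)

errors, rmse, max_err = trajectory_errors(estimated, ground_truth)
print(f"RMSE = {rmse:.3f} m, max error = {max_err:.3f} m")
```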
Localizability estimation for mobile robots based on probabilistic grid map and its applications to localization
Zhe Liu, Weidong Chen, Yong Wang, Jingchuan Wang
{"title":"Localizability estimation for mobile robots based on probabilistic grid map and its applications to localization","authors":"Zhe Liu, Weidong Chen, Yong Wang, Jingchuan Wang","doi":"10.1109/MFI.2012.6343051","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343051","url":null,"abstract":"A novel approach to estimate localizability for mobile robots is presented based on probabilistic grid map (PGM). Firstly, a static localizability matrix is proposed for off-line estimation over the priori PGM. Then a dynamic localizability matrix is proposed to deal with unexpected dynamic changes. These matrices describe both localizability index and localizability direction quantitatively. The validity of the proposed method is demonstrated by experiments in different typical environments. Furthermore, two typical localization-related applications, including active global localization and pose tracking, are presented for illustrating the effectiveness of the proposed localizability estimation method.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129490586","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 28
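The paper quantifies a localizability index and direction from a grid map. Purely to illustrate that idea, and not as the paper's matrix construction, the sketch below accumulates occupancy-gradient outer products around a pose and reads an index and a weakest direction off the eigen-decomposition.

```python
# Illustrative only: one way to get a localizability-like index and direction
# from an occupancy grid, by accumulating gradient outer products near a pose
# and eigen-decomposing the result. This is NOT the paper's matrix definition.
import numpy as np

def localizability_matrix(occupancy, center, window=10):
    """Sum g g^T of occupancy gradients in a square window around `center`."""
    gy, gx = np.gradient(occupancy.astype(float))
    r, c = center
    sl = (slice(max(r - window, 0), r + window + 1),
          slice(max(c - window, 0), c + window + 1))
    grads = np.stack([gx[sl].ravel(), gy[sl].ravel()], axis=1)
    return grads.T @ grads

# Toy occupancy grid: a single straight wall.
grid = np.zeros((60, 60))
grid[30, :] = 1.0

M = localizability_matrix(grid, center=(28, 30))
eigvals, eigvecs = np.linalg.eigh(M)
print("localizability index (smallest eigenvalue):", eigvals[0])
print("weakest direction:", eigvecs[:, 0])  # along the wall: poorly constrained
```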
Towards autonomous airborne mapping of urban environments
B. Adler, Junhao Xiao
{"title":"Towards autonomous airborne mapping of urban environments","authors":"B. Adler, Junhao Xiao","doi":"10.1109/MFI.2012.6343030","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343030","url":null,"abstract":"This work documents our progress on building an unmanned aerial vehicle capable of autonomously mapping urban environments. This includes localization and tracking of the vehicle's pose, fusion of sensor-data from onboard GNSS receivers, IMUs, laserscanners and cameras as well as realtime path-planning and collision-avoidance. Currently, we focus on a physics-based approach to computing waypoints, which are subsequently used to steer the platform in three-dimensional space. Generation of efficient sensor trajectories for maximized information gain operates directly on unorganized point clouds, creating a perfect fit for environment mapping with commonly used LIDAR sensors and time-of-flight cameras. We present the algorithm's application to real sensor-data and analyze its performance in a virtual outdoor scenario.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130780503","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 6
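As a loose, assumed stand-in for the information-gain-driven waypoint generation described above (not the paper's physics-based method), one might score candidate waypoints by how many not-yet-observed points of an unorganized point cloud lie within sensor range.

```python
# Simplified illustration, not the paper's physics-based waypoint generator:
# score candidate waypoints by how many unobserved points of an unorganized
# point cloud lie within an assumed sensor range.
import numpy as np

def score_waypoints(candidates, cloud, observed_mask, sensor_range=15.0):
    """Return, for each candidate position, the count of unobserved points in range."""
    unobserved = cloud[~observed_mask]
    scores = []
    for wp in candidates:
        dists = np.linalg.norm(unobserved - wp, axis=1)
        scores.append(int(np.sum(dists < sensor_range)))
    return np.array(scores)

rng = np.random.default_rng(0)
cloud = rng.uniform(0, 100, size=(5000, 3))       # unorganized point cloud
observed = rng.random(5000) < 0.4                 # 40% already observed
candidates = rng.uniform(0, 100, size=(8, 3))     # hypothetical waypoints

scores = score_waypoints(candidates, cloud, observed)
print("best candidate waypoint:", candidates[np.argmax(scores)])
```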
Monocular heading estimation in non-stationary urban environment
Christian Herdtweck, Cristóbal Curio
{"title":"Monocular heading estimation in non-stationary urban environment","authors":"Christian Herdtweck, Cristóbal Curio","doi":"10.1109/MFI.2012.6343057","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343057","url":null,"abstract":"Estimating heading information reliably from visual cues only is an important goal in human navigation research as well as in application areas ranging from robotics to automotive safety. The focus of expansion (FoE) is deemed to be important for this task. Yet, dynamic and unstructured environments like urban areas still pose an algorithmic challenge. We extend a robust learning framework that operates on optical flow and has at center stage a continuous Latent Variable Model (LVM) [1]. It accounts for missing measurements, erroneous correspondences and independent outlier motion in the visual field of view. The approach bypasses classical camera calibration through learning stages, that only require monocular video footage and corresponding platform motion information. To estimate the FoE we present both a numerical method acting on inferred optical flow fields and regression mapping, e.g. Gaussian-Process regression. We also present results for mapping to velocity, yaw, and even pitch and roll. Performance is demonstrated for car data recorded in non-stationary, urban environments.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134368416","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
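For context, a common numerical way to estimate the FoE from a mostly translational flow field is to solve for the point all flow vectors radiate from in a least-squares sense. The sketch below is that generic construction under the pure-translation assumption, not the paper's learning-based pipeline.

```python
# Generic FoE estimation sketch: for purely translational motion every flow
# vector points away from the FoE, so v_y*(x - x_e) - v_x*(y - y_e) = 0 for
# each pixel (x, y) with flow (v_x, v_y); stack the equations and solve for
# (x_e, y_e). This is a textbook construction, not the paper's method.
import numpy as np

def estimate_foe(points, flows):
    """Least-squares focus of expansion from pixel positions and flow vectors."""
    vx, vy = flows[:, 0], flows[:, 1]
    A = np.column_stack([vy, -vx])
    b = vy * points[:, 0] - vx * points[:, 1]
    foe, *_ = np.linalg.lstsq(A, b, rcond=None)
    return foe

# Synthetic radial flow expanding from a true FoE at (320, 240).
rng = np.random.default_rng(1)
pts = rng.uniform([0, 0], [640, 480], size=(500, 2))
flow = 0.05 * (pts - np.array([320.0, 240.0]))
flow += rng.normal(scale=0.2, size=flow.shape)     # measurement noise

print("estimated FoE:", estimate_foe(pts, flow))
```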
Tracking ground moving extended objects using RGBD data
M. Baum, F. Faion, U. Hanebeck
{"title":"Tracking ground moving extended objects using RGBD data","authors":"M. Baum, F. Faion, U. Hanebeck","doi":"10.1109/MFI.2012.6343003","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343003","url":null,"abstract":"This paper is about an experimental set-up for tracking a ground moving mobile object from a bird's eye view. In this experiment, an RGB and depth camera is used for detecting moving points. The detected points serve as input for a probabilistic extended object tracking algorithm that simultaneously estimates the kinematic parameters and the shape parameters of the object. By this means, it is easy to discriminate moving objects from the background and the probabilistic tracking algorithm ensures a robust and smooth shape estimate. We provide an experimental evaluation of a recent Bayesian extended object tracking algorithm based on a so-called Random Hypersurface Model and give a comparison with active contour models.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132020829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 19
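As an illustrative, assumed front end rather than the authors' set-up, moving points can be extracted from a depth stream by differencing each frame against a static background depth image, as sketched below.

```python
# Illustration only: detect "moving points" in a depth image by comparing it
# against a static background depth frame. The threshold and frames are
# assumptions standing in for the paper's RGBD front end.
import numpy as np

def moving_points(depth, background, threshold=0.05):
    """Return (row, col, depth) of pixels whose depth changed by > threshold [m]."""
    valid = (depth > 0) & (background > 0)           # ignore missing depth readings
    changed = valid & (np.abs(depth - background) > threshold)
    rows, cols = np.nonzero(changed)
    return np.column_stack([rows, cols, depth[rows, cols]])

# Synthetic 480x640 depth frames: flat floor at 2.0 m, one moving box at 1.5 m.
background = np.full((480, 640), 2.0)
frame = background.copy()
frame[200:260, 300:380] = 1.5

pts = moving_points(frame, background)
print("detected", len(pts), "moving points")         # 60 * 80 = 4800 pixels
```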
A sensor fusion approach for localization with cumulative error elimination
Feihu Zhang, H. Stahle, Guang Chen, Chao-Wei Chen, Carsten Simon, C. Buckl, A. Knoll
{"title":"A sensor fusion approach for localization with cumulative error elimination","authors":"Feihu Zhang, H. Stahle, Guang Chen, Chao-Wei Chen, Carsten Simon, C. Buckl, A. Knoll","doi":"10.1109/MFI.2012.6343009","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343009","url":null,"abstract":"This paper describes a robust approach which improves the precision of vehicle localization in complex urban environments by fusing data from GPS, gyroscope and velocity sensors. In this method, we apply Kalman filter to estimate the position of the vehicle. Compared with other fusion based localization approaches, we process the data in a public coordinate system, called Earth Centred Earth Fixed (ECEF) coordinates and eliminate the cumulative error by its statistics characteristics. The contribution is that it not only provides a sensor fusion framework to estimate the position of the vehicle, but also gives a mathematical solution to eliminate the cumulative error stems from the relative pose measurements (provided by the gyroscope and velocity sensors). The experiments exhibit the reliability and the feasibility of our approach in large scale environment.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"134 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114757731","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 34
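To make the coordinate choice concrete, the sketch below converts WGS-84 geodetic GPS fixes to ECEF and applies a plain linear Kalman measurement update to a dead-reckoned position; the state model, noise values, and sample fix are assumptions, not the paper's filter design.

```python
# Sketch under stated assumptions: WGS-84 geodetic -> ECEF conversion followed
# by a generic linear Kalman position update. Noise levels, state model and
# the sample fix are illustrative, not the paper's parameters.
import numpy as np

A_WGS84 = 6378137.0                # semi-major axis [m]
E2_WGS84 = 6.69437999014e-3        # first eccentricity squared

def geodetic_to_ecef(lat_deg, lon_deg, h):
    lat, lon = np.radians(lat_deg), np.radians(lon_deg)
    n = A_WGS84 / np.sqrt(1.0 - E2_WGS84 * np.sin(lat) ** 2)
    x = (n + h) * np.cos(lat) * np.cos(lon)
    y = (n + h) * np.cos(lat) * np.sin(lon)
    z = (n * (1.0 - E2_WGS84) + h) * np.sin(lat)
    return np.array([x, y, z])

def kalman_update(x, P, z, R):
    """Standard Kalman measurement update with an identity measurement model."""
    H = np.eye(3)
    S = H @ P @ H.T + R
    K = P @ H.T @ np.linalg.inv(S)
    x_new = x + K @ (z - H @ x)
    P_new = (np.eye(3) - K @ H) @ P
    return x_new, P_new

# Prior from dead reckoning (gyro + velocity), then a GPS fix in ECEF.
x_prior = geodetic_to_ecef(48.1351, 11.5820, 520.0) + np.array([3.0, -2.0, 1.0])
P_prior = np.eye(3) * 25.0                          # 5 m standard deviation prior
z_gps = geodetic_to_ecef(48.1351, 11.5820, 520.0)
x_post, P_post = kalman_update(x_prior, P_prior, z_gps, R=np.eye(3) * 9.0)
print("corrected ECEF position:", x_post)
```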
On Active Sensing methods for localization scenarios with range-based measurements
J. Trapnauskas, M. Romanovas, L. Klingbeil, A. Al-Jawad, M. Trächtler, Y. Manoli
{"title":"On Active Sensing methods for localization scenarios with range-based measurements","authors":"J. Trapnauskas, M. Romanovas, L. Klingbeil, A. Al-Jawad, M. Trächtler, Y. Manoli","doi":"10.1109/MFI.2012.6343013","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343013","url":null,"abstract":"The work demonstrates how the methods of Active Sensing (AS), based on the theory of optimal experimental design, can be applied for a location estimation scenario. The simulated problem consists of several mobile and fixed nodes where each mobile unit is equipped with a gyroscope and an incremental path encoder and is capable to make a selective range measurement to one of several fixed anchors as well as to other moving tags. All available measurements are combined within a fusion filter, while the range measurements are selected with one of the AS methods in order to minimize the position uncertainty under the constraints of a maximum available measurement rate. Different AS strategies are incorporated into a recursive Bayesian estimation framework in the form of Extended Kalman and Particle Filters. The performance of the fusion algorithms augmented with the active sensing techniques is discussed for several scenarios with different measurement rates and a number of fixed or moving tags.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"149 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114787873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 0
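A small sketch of the underlying idea, namely choosing the anchor whose range measurement most reduces position uncertainty in an EKF, is given below; the 2D position-only state, anchor layout, noise, and greedy trace criterion are assumptions for illustration.

```python
# Illustrative greedy active-sensing step: pick the anchor whose range
# measurement yields the smallest trace of the EKF posterior covariance.
# Anchor layout, noise and the 2D position-only state are assumptions.
import numpy as np

def posterior_cov_after_range(P, x, anchor, range_var):
    """EKF covariance update for a single range measurement to `anchor`."""
    diff = x - anchor
    r = np.linalg.norm(diff)
    H = (diff / r).reshape(1, 2)                 # Jacobian of range w.r.t. position
    S = H @ P @ H.T + range_var
    K = P @ H.T / S
    return (np.eye(2) - K @ H) @ P

def best_anchor(P, x, anchors, range_var=0.04):
    traces = [np.trace(posterior_cov_after_range(P, x, a, range_var)) for a in anchors]
    return int(np.argmin(traces)), traces

x_est = np.array([2.0, 3.0])                     # current position estimate
P_est = np.diag([1.0, 0.09])                     # uncertain mostly along x
anchors = np.array([[10.0, 3.0], [2.0, 12.0], [9.0, 9.0]])

idx, traces = best_anchor(P_est, x_est, anchors)
print("measure range to anchor", idx, "posterior traces:", np.round(traces, 3))
```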
Multi sensors based ultrasonic human face identification: Experiment and analysis
Y. Xu, J. Y. Wang, B. Cao, J. Yang
{"title":"Multi sensors based ultrasonic human face identification: Experiment and analysis","authors":"Y. Xu, J. Y. Wang, B. Cao, J. Yang","doi":"10.1109/MFI.2012.6343000","DOIUrl":"https://doi.org/10.1109/MFI.2012.6343000","url":null,"abstract":"This paper presents an ultrasonic sensing based human face identification approach. As a biometric identification method, ultrasonic sensing could detect the geometric structure of faces without being affected by the illumination of the environment. Multi ultrasonic sensors are used for data collection. Continuous Transmitted Frequency Modulated (CTFM) signal is chosen as the detection signal. High Resolution Range Profile (HRRP) is extracted from the echo signal as the feature and a K nearest neighbor (KNN) classifier is used for the face classification. Data fusion is applied to improve the performance for identifying faces with multi facial expressions. Experimental results show a success rate of more than 96.9% when the test database includes 62 persons and 5 facial expressions for each person. The results prove that multi sensors ultrasonic sensing could be a potential competent face identification solution for many applications.","PeriodicalId":103145,"journal":{"name":"2012 IEEE International Conference on Multisensor Fusion and Integration for Intelligent Systems (MFI)","volume":"308 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2012-11-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123481817","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 5
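To illustrate the classification stage in generic terms (an assumed stand-in, not the authors' processing chain), range-profile-style magnitude spectra can be fed to a K-nearest-neighbor classifier, as in the sketch below.

```python
# Generic sketch of range-profile features plus a KNN classifier, standing in
# for the HRRP/KNN stage described in the abstract; signals and labels are synthetic.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

def range_profile(echo):
    """Magnitude spectrum of an echo as a simple range-profile-style feature."""
    return np.abs(np.fft.rfft(echo))

rng = np.random.default_rng(2)
n_persons, n_samples, n_points = 5, 8, 256

# Synthetic echoes: each "person" gets a characteristic signature plus noise.
signatures = rng.normal(size=(n_persons, n_points))
echoes, labels = [], []
for person in range(n_persons):
    for _ in range(n_samples):
        echoes.append(signatures[person] + 0.3 * rng.normal(size=n_points))
        labels.append(person)

features = np.array([range_profile(e) for e in echoes])
labels = np.array(labels)

# Even-indexed samples for training, odd-indexed for a simple hold-out test.
knn = KNeighborsClassifier(n_neighbors=3).fit(features[::2], labels[::2])
print("hold-out accuracy:", knn.score(features[1::2], labels[1::2]))
```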