{"title":"Automated odometry self-calibration for car-like robots with four-wheel-steering","authors":"K. Bohlmann, Henrik Marks, A. Zell","doi":"10.1109/ROSE.2012.6402609","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402609","url":null,"abstract":"This paper addresses the task of calibrating the kinematic parameters and odometry of car-like robots with dual-axis steering. To achieve this goal, only the robot's built-in laser rangefinders and no external tracking systems are employed. We introduce a method to actively calibrate both the front and rear steering angles with a multi-input multi-output (MIMO) controller. Using the determined function between steering servo input and steering angle, the effective wheelbase and wheel diameters are estimated. We present an automated self-calibration procedure for car-like robots with dual-axis steering. The results are verified using our self-developed outdoor robot platform.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123503362","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gaze estimation using Kinect/PTZ camera","authors":"Reza Jafari, D. Ziou","doi":"10.1109/ROSE.2012.6402633","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402633","url":null,"abstract":"This paper describes a novel method for eye-gaze estimation under normal head movement. In this method, head position and orientation are acquired by a Kinect, while eye direction is obtained by a PTZ camera. We propose Bayesian multinomial logistic regression based on a variational approximation to construct a gaze mapping function from head and eye features. Our proposed method eliminates three common drawbacks of most conventional techniques: the stationary head position, the awkward per-user calibration procedure, and the active light source. The efficiency of the proposed method is validated by performance evaluation for different users under varying head position and orientation.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131619836","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A forward model for an active tactile sensor using Echo State Networks","authors":"Nalin Harischandra, V. Dürr","doi":"10.1109/ROSE.2012.6402605","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402605","url":null,"abstract":"Here, we introduce a forward model designed for predicting the expected reading of a bionic tactile sensor (antenna) mounted on a wheeled robot. The model was used to distinguish self-generated stimulation from true tactile events at the antenna. An Echo State Network (ESN), a special type of recurrent neural network well suited to chaotic time series prediction, is used to implement the forward model. Inputs to the ESN are the motor command, which sets the position of the antenna, and a local proprioceptive signal, which measures the acceleration of the robot platform. The model can successfully be used to detect a tactile contact on the antenna while the robot is moving along a path with obstacles. Such forward models are good candidates for eliminating, in a simple neural way, the self-stimulation of sensors of other modalities due to ego-motion.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131872331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Prediction of object manipulation using tactile sensor information by a humanoid robot","authors":"Shigeyuki Uematsu, Yuichi Kobayashi, A. Shimizu, T. Kaneko","doi":"10.1109/ROSE.2012.6402611","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402611","url":null,"abstract":"This paper presents a framework for acquiring lifting-up manipulation based on tactile sensing information by a humanoid robot. Feature extraction from sensor information, including tactile information, is presented using linear and nonlinear mappings. Information acquired from sensors is mapped to a lower-dimensional space for predicting success of the lifting-up task. The robot judges success or failure of the manipulation using the obtained feature space and object orientation. The proposed method was evaluated in simulation with a humanoid robot. Sensor information obtained at the beginning stage of the lifting-up task was utilized to predict whether the robot can accomplish the task without dropping the object. It was verified that the proposed feature extraction provides sufficient information to predict success of the task. The prediction will be utilized to modify the posture of the robot.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134518513","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Determining wrist reference kinematics using a sensory-mounted stress ball","authors":"Ali Karime, M. Eid, W. Gueaieb, Abdulmotaleb El Saddik","doi":"10.1109/ROSE.2012.6402630","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402630","url":null,"abstract":"One of the research voids in the study of home-based rehabilitation is the lack of performance benchmarks for various body kinematics. The objective of this work is to form a metric to evaluate wrist motion for rehabilitation applications. The wrist motion components considered in this study are the angular velocity and acceleration in each plane of movement, namely Pronation/Supination, Flexion/Extension, and Radial/Ulnar Deviation. Two games were developed to measure wrist motion variables: the Cup and Plate game (to measure the Supination/Pronation motions) and the Golf game (a horizontal version to measure the Radial/Ulnar motions and a vertical version to measure the Extension/Flexion motions). The derived values can serve as a motion benchmark to detect proper movement and steadiness of the wrist, in order to quantify the quality of patient performance.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131969970","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Are laser scanners replaceable by Kinect sensors in robotic applications?","authors":"S. Zug, F. Penzlin, André Dietrich, T. Nguyen, Sven Albert","doi":"10.1109/ROSE.2012.6402619","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402619","url":null,"abstract":"Laser scanners are omnipresent in robotic applications. Their measurements are used in many scenarios for robust map building, localization, collision avoidance, etc. However, given the required measurement precision and mechanics, a laser scanner is quite expensive. Hence, the robotics community is looking for alternative sensors. Since 2010, a new 3D sensor system, the Microsoft Kinect [1], originally developed for computer games, has been available and applied in robotic applications. With an appropriate filter tool-chain, its output can be mapped to a 2D laser scanner measurement. The reduced data set is ready to be processed by the established algorithms and methods developed for laser scanners. But will the Kinect sensor replace laser scanners in robotic applications? This paper compares the technical parameters of the new sensor with established laser scanners. Afterwards, we investigate the possibilities and limits of a Kinect for three common robotic applications: map building, localization and obstacle avoidance.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121888059","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Slippery and sandy ground detection for hexapod robots based on organic computing principles and somatosensory feedback","authors":"Ahmad Al-Homsy, J. Hartmann, E. Maehle","doi":"10.1109/ROSE.2012.6402620","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402620","url":null,"abstract":"Insect-like walking of six-legged robots on unstructured and rough terrain is considered a challenging task. Furthermore, the properties of the walking ground are an important issue and a challenge for ensuring stable adaptive walking. This paper sheds light on the applied decentralized controller approach for detecting slippery and sandy ground and also presents the proposed strategies to overcome these challenges. The novelty of our approach is the evaluation of the local current consumption and angular position of each leg's joints as somatosensory feedback. Backward walking is proposed as a reflex reaction once slippery ground is detected, and adaptive walking as soon as the robot detects sandy ground. Our approach is based on an organic computing architecture and was tested on a low-cost version of the OSCAR walking robot.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122144527","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real time face detection using geometric constraints, navigation and depth-based skin segmentation on mobile robots","authors":"Duc My Vo, A. Masselli, A. Zell","doi":"10.1109/ROSE.2012.6402617","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402617","url":null,"abstract":"Face detection is an important component for mobile robots to interact with humans in a natural way. Various face detection algorithms for mobile robots have been proposed; however, almost none of them yet meets the accuracy and speed requirements for running in real time on a robot platform. In this paper, we present a method for face detection on mobile robots that combines color and depth images provided by a Kinect camera with navigation information. This method is shown to be fast and accurate, and it runs in real time in indoor environments.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126685894","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Estimation with non-white Gaussian observation noise using a generalised Ensemble Kalman filter","authors":"J. Curn, D. Marinescu, G. Lacey, V. Cahill","doi":"10.1109/ROSE.2012.6402618","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402618","url":null,"abstract":"Many sensor fusion approaches based on the Kalman filter or its variants assume that sensor measurements are disturbed by white Gaussian noise, which implies an observation error statistically independent of the state estimate. These methods are often applied in situations where the white noise assumption may not be satisfied, which potentially leads to overconfidence and divergence of the filter. In this paper, we derive a new Kalman gain formula that provides an optimal update rule in the presence of a known correlation between errors in the state estimate and an observation, caused by the presence of a shared error term. The new method is described in the context of the Ensemble Kalman filter, where such a correlation can be directly estimated from the state and observation samples. The proposed generalised Ensemble Kalman filter is evaluated in a scenario where a mobile robot estimates its global position by fusing visual odometry data with an auto-correlated sequence of measurements from a stand-alone Global Positioning System (GPS) receiver.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134277556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Object recognition based on depth information and associative memory","authors":"S. Puls, Norah Schnorr, H. Wörn","doi":"10.1109/ROSE.2012.6402606","DOIUrl":"https://doi.org/10.1109/ROSE.2012.6402606","url":null,"abstract":"Steady improvement of robotic systems, driven by developments in sensing the world, enables advances towards human-robot cooperation. In order for the robot to be reactive in its environment, objects need to be identified. In this paper, an approach is presented that allows identification of objects in the working area of an industrial robot. Neural networks are used as associative memory to learn new items and efficiently recognize learned objects.","PeriodicalId":306272,"journal":{"name":"2012 IEEE International Symposium on Robotic and Sensors Environments Proceedings","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2012-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121728329","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}