{"title":"A particle filtering method for wireless sensor network localization with an aerial robot beacon","authors":"F. Caballero, L. Merino, I. Maza, A. Ollero","doi":"10.1109/ROBOT.2008.4543271","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543271","url":null,"abstract":"This paper presents a new method for the 3D localization of an outdoor wireless sensor network (WSN) by using a single flying beacon-node on-board an autonomous helicopter, which is aware of its position thanks to a GPS device. The technique is based on particle filtering and does not require any prior information about the position of the nodes to be estimated. Its structure and stochastic nature allows a distributed computation of the position of the nodes. The paper shows how the method is very suitable for outdoor applications with robotic data-mule systems. The paper includes a section with experiments.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"384 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115703570","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Driving skill characterization: A feasibility study","authors":"Yilu Zhang, William C. Lin, Y. Chin","doi":"10.1109/ROBOT.2008.4543600","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543600","url":null,"abstract":"Information about driver's driving skill can be used to adapt vehicle control parameters to facilitate the specific driver's needs in terms of vehicle performance and driving pleasure. This paper presents an approach to driving skill characterization from a pattern-recognition perspective. The basic idea is to extract patterns that reflect the driver's driving skill level from the measurements of the driver's behavior and the vehicle response. The preliminary experimental results demonstrate the feasibility of using pattern recognition approach to characterize driver's handling skill. This paper concludes with the discussions of the challenges and future works to bring the proposed technique to practical use.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115750940","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving force feedback fidelity in wave-variable-based teleoperation","authors":"Y. Ye, P. X. Liu","doi":"10.1109/ROBOT.2008.4543208","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543208","url":null,"abstract":"In wave-variable-based teleoperation systems, the perceived force at the master side is biased due to the nature of wave-variable-based communication. This paper proposes an augmented wave-variable-based approach that can partially cancel the bias portion and improve the fidelity of force feedback significantly. In this approach, the returning wave is augmented by the velocities of the both sides of the communication channel. The steady-state position tracking is not affected by the modification. Passivity of the new teleoperation scheme can be obtained by tuning the bandwidth of a low-pass filter. Hence stability is always achievable. Simulation results demonstrate the effectiveness of the scheme.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114919267","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Design and control of a multifunction myoelectric hand with new adaptive grasping and self-locking mechanisms","authors":"J. Chu, Dong-Hyun Jung, Yun-Jung Lee","doi":"10.1109/ROBOT.2008.4543294","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543294","url":null,"abstract":"This paper presents a multifunction myoelectric hand that is designed with underactuated mechanisms. The linger design allows an adaptive grasp, including adaptation between lingers and phalanges with respect to the shape of an object. In addition, a self-lock is embedded in the metacarpophalangeal joint to prevent back driving when external forces act on the lingers. The thumb design also provides adaptation between phalanges and adds an intermittent rotary motion to the carpometacarpal joint. As a result, the hand can perform versatile grasping motions using only two motors, and is capable of natural and stable grasping without complex sensor and servo systems. Moreover, the adaptive grasping capabilities reduce the requirements of electromyogram pattern recognition, as analogous motions, such as cylindrical and tip grasps, can be classified as one motion.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117045176","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Methods for end-effector coupling in robot assisted interventions","authors":"J. Burgner-Kahrs, Yaokun Zhang, J. Raczkowsky, H. Wörn, G. Eggers, J. Mühling","doi":"10.1109/ROBOT.2008.4543729","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543729","url":null,"abstract":"Robot assisted interventions often require coupling and decoupling of the robot to/from a specific tool. By using manual gripper changing systems these operations are facilitated, but the robot has to approach to and move away from the coupling position. Industrial applications are mostly based on movements which are teached-in, since the working environment is perfectly described (i.e. working cell). Especially in robot assisted surgery we are facing non fixed tools to which the robot has to be coupled (e.g. a holding device attached to a mobilised bone) and restricted working areas with special safety requirements. In this paper we present an automatic end-effector registration method and a semiautomatic coupling procedure exemplarily for robot assisted orthognathic surgery. By using means of an optical localisation system and force- /torque sensing, the coupling procedure is controlled by a multi- sensor data fusion approach. The developed methods can be adapted to any robot assisted intervention.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"43 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117249928","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Caging rigid polytopes via finger dispersion control","authors":"Peam Pipattanasomporn, Pawin Vongmasa, A. Sudsang","doi":"10.1109/ROBOT.2008.4543364","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543364","url":null,"abstract":"The object caging problem focuses on designing a formation of fingers that keeps an object within a bounded space without immobilizing it. This paper addresses the problem of designing such formation for object represented by a polytope in any finite dimensional workspace and for any specified number of pointed finger. Our goal is to characterize all the caging sets, each of which corresponds to a largest connected set of initial formations of fingers guaranteed to cage the object, up to maintaining a certain class of real-valued measurement induced by the whole fingers' formation below a critical value. In our previous works, such measurement is simply the distance between two fingers (the formation). We found that it is possible to apply the framework based on graph search from the previous works to broader classes of measurements. In this paper, we introduce two of measurements, called dispersion and concentration and propose a generalized approach to query and to report all caging sets with respect to a given dispersion or concentration.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122122096","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tracking hidden agents through shadow information spaces","authors":"Jingjin Yu, S. LaValle","doi":"10.1109/ROBOT.2008.4543562","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543562","url":null,"abstract":"This paper addresses problems of inferring the locations of moving agents from combinatorial data extracted by robots that carry sensors. The agents move unpredictably and may be fully distinguishable, partially distinguishable, or completely indistinguishable. The key is to introduce information spaces that extract and maintain combinatorial sensing information. This leads to monitoring the changes in connected components of the shadow region, which is the set of points not visible to any sensors at a given time. When used in combination with a path generator for the robots, the approach solves problems such as counting the number of agents, determining movements of teams of agents, and solving pursuit-evasion problems. An implementation with examples is presented.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116908058","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards detection of orthogonal planes in monocular images of indoor environments","authors":"B. Micusík, H. Wildenauer, M. Vincze","doi":"10.1109/ROBOT.2008.4543335","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543335","url":null,"abstract":"In this paper, we describe the components of a novel algorithm for the extraction of dominant orthogonal planar structures from monocular images taken in indoor environments. The basic building block of our approach is the use of vanishing points and vanishing lines imposed by the frequently observed dominance of three mutually orthogonal vanishing directions in man-made world. Vanishing points are found by an improved approach, taking no assumptions on known internal or external camera parameters. The problem of detecting planar patches is attacked using a probabilistic framework, searching for the maximum a posteriori probability (MAP) in a Markov Random Field (MRF). For this, we propose a novel formulation fusing geometric information obtained from vanishing points and features, such as rectangles and partial rectangles, together with a color-homogeneity criteria imposed by an image over-segmentation. The method was evaluated on a set of images exhibiting largely varying characteristics concerning image quality and scene complexity. Experiments show that the method, despite the variations, works in a stable manner and that its performance compares favorably to the state-of-the-art.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117117721","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Concurrent multi-link deployment of a gravity-assisted underactuated snake robot for aircraft assembly","authors":"B. Roy, H. Asada","doi":"10.1109/ROBOT.2008.4543835","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543835","url":null,"abstract":"This paper presents algorithms for concurrent deployment of multiple links of a gravity-assisted underactuated robot arm. The joints of the hyper-articulated arm have no dedicated actuators, but are activated with gravity. By tilting the base link appropriately, multiple unactuated links may be steered simultaneously to desired angular positions. This underactuated arm design was motivated by the need for a compact snake-like robot that can go into aircraft wings and perform assembly operations using heavy end-effecters. The dynamics of the unactuated links are essentially 2nd order non- holonomic constraints, for which there are no general control algorithms. We perform a controllability analysis to establish the feasibility of multi-link positioning using the available inputs, viz., the biaxial tilts of the base link. We propose a feed-forward control algorithm for simultaneous positioning of multiple links. We also propose an intermittent feedback control scheme to compensate for disturbances acting on the system. We built a 4 link prototype where the base is tilted using a Stewart Platform. The proposed control schemes are implemented on our prototype system. The experimental results indicate the efficacy of the control schemes.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117187544","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Toward designing a robot that learns actions from parental demonstrations","authors":"Y. Nagai, C. Muhl, K. Rohlfing","doi":"10.1109/ROBOT.2008.4543753","DOIUrl":"https://doi.org/10.1109/ROBOT.2008.4543753","url":null,"abstract":"How to teach actions to a robot as well as how a robot learns actions is an important issue to be discussed in designing robot learning systems. Inspired by human parent-infant interaction, we hypothesize that a robot equipped with infant-like abilities can take advantage of parental proper teaching. Parents are known to significantly alter their infant-directed actions versus adult-directed ones, e.g. make more pauses between movements, which is assumed to aid the infants' understanding of the actions. As a first step, we analyzed parental actions using a primal attention model. The model based on visual saliency can detect likely important locations in a scene without employing any knowledge about the actions or the environment. Our statistical analysis revealed that the model was able to extract meaningful structures of the actions, e.g. the initial and final state of the actions and the significant state changes in them, which were highlighted by parental action modifications. We further discuss the issue of designing an infant-like robot that can induce parent-like teaching, and present a human-robot interaction experiment evaluating our robot simulation equipped with the saliency model.","PeriodicalId":351230,"journal":{"name":"2008 IEEE International Conference on Robotics and Automation","volume":"320 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2008-05-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125774259","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}