{"title":"MICAbot: a robotic platform for large-scale distributed robotics","authors":"M. B. McMickell, B. Goodwine, L. Montestruque","doi":"10.1109/ROBOT.2003.1241823","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241823","url":null,"abstract":"This paper presents a novel robotic platform for experimental research in large-scale distributed robotics and mobile sensor networks. The MICAbot is both inexpensive and flexible making it useful for a wide range of experimental goals. In this paper, we provide a description of the MICAbot design. Furthermore, we also discuss general design considerations involved in designing large-scale distributed robots focusing on cost, size, and functionality.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"39 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133018869","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards a factored analysis of legged locomotion models","authors":"R. Altendorfer, D. Koditschek, P. Holmes","doi":"10.1109/ROBOT.2003.1241570","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241570","url":null,"abstract":"In this paper, we report on a new stability analysis for hybrid legged locomotion systems based on factorization of return maps. We apply this analysis to a family of models of the spring loaded inverted pendulum (SLIP) with different leg recirculation strategies. We obtain a necessary condition for the asymptotic stability of those models, which is formulated as an exact algebraic expression despite the non-integrability of the SLIP dynamics. We outline the application of this analysis of other models of legged locomotion and it importance for the stability of legged robots and animals.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133539002","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Tele-coordinated control of multi-robot systems via the Internet","authors":"I. Elhajj, N. Xi, A. Goradia, Chow Man Kit, Yunhui Liu, T. Fukuda","doi":"10.1109/ROBOT.2003.1241830","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241830","url":null,"abstract":"The coordination of multi-robots is required in many scenarios for efficiency and task completion. Combined with teleoperation capabilities, coordinating robots provide a powerful tool. Add to this the Internet and now it is possible for multi-experts at multi-remote sites to control multi-robots in a coordinated fashion. For this to be feasible there are several hurdles to be crossed including Internet type delays, uncertainties in the environment and uncertainties in the object manipulated. In addition, there is a need to measure and control the quality of tele-coordination. This paper proposes a measure for the quality of tele-coordination, referred to as the coordination index, and details the design procedure that ensures a system performs at a required index. The theory developed was tested by bilaterally tele-coordinating two mobile manipulators via the Internet. The experimental results confirmed the theory presented.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133306385","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inverse dynamics and simulation of a 3-DOF spatial parallel manipulator","authors":"Yu-Wen Li, Jinsong Wang, Liping Wang, Xinjun Liu","doi":"10.1109/ROBOT.2003.1242226","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1242226","url":null,"abstract":"Recently the parallel manipulators with less DOF have attracted the researchers, but works on their dynamics are relative few. In this paper, an inverse dynamic formulation is presented by the Newton-Euler approach for a spatial parallel manipulator, which has two translational degrees of freedom and one rotational degree of freedom. The inverse kinematics analysis is firstly performed in closed form. Then the force and moment equilibrium equations for the manipulator are presented. According to the kinematic constraints of the legs and the platform, some joint constraint forces are eliminated and an algorithm to solve the actuator forces is given. In addition, ADAMS is used to perform the kinematic and dynamic simulation for the manipulator. The simulation results are compared to those derived from algebraic formulae and the comparison shows the validity of the mathematical model.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133333714","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On-line soil property estimation for autonomous excavator vehicles","authors":"Choopar Tan, Y. Zweiri, K. Althoefer, L. Seneviratne","doi":"10.1109/ROBOT.2003.1241583","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241583","url":null,"abstract":"This paper presents a novel method for estimating soil properties on-line during excavation tasks such as ground leveling, digging and sheet pilling. The proposed method computes key soil parameters by measuring the forces acting on the excavator bucket whilst being in contact with the soil and minimizing the error between measured forces and estimated forces produced by a real-time capable soil model. Two soil models, the Mohr-Coulomb soil model and the Chen and Liu upper bound soil model, are implemented and researched in the context of this estimation scheme. Parameter optimization is carried out employing the Newton Raphson method. The method is evaluated using experimental data and through comparison with an approach that makes use of graphical intersection for model optimization. The results demonstrate that the proposed Newton Raphson-based method is as accurate as the graphical intersection-based approach, but up to 2000 times faster, and thus, most suitable for on-line soil parameter estimation in an automated system which provides optimized digging trajectories for a given excavation task.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"71 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133467627","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Error-tolerant execution of complex robot tasks based on skill primitives","authors":"Ulrike Thomas, B. Finkemeyer, T. Kröger, F. Wahl","doi":"10.1109/ROBOT.2003.1242062","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1242062","url":null,"abstract":"This paper presents a general approach to specify and execute complex robot tasks considering uncertain environments. Robot tasks are defined by a precise definition of so-called skill primitive nets, which are based on Mason's hybrid force/velocity and position control concept, but it is not limited to force/velocity and position control. Two examples are given to illustrate the formally defined skill primitive nets. We evaluated the controller and the trajectory planner by several experiments. Skill primitives suite very well as interface to robot control systems. The presented hybrid control approach provides a modular, flexible, and robust system; stability is guaranteed, particularly at transitions of two skill primitives. With the interface explained here, the results of compliance motion planning become possible to be examined in real work cells. We have implemented an algorithm to search for mating directions in up to three-dimensional configuration-spaces. Thereby, on one hand we have released compliant motion control concepts and on the other hand we can provide solutions for fine motion and assembly planning. This paper shows, how these two fields can be combined by the general concept of skill primitive nets introduced here, in order to establish a powerful system, which is able to automatically execute prior calculated assembly plans based on CAD-data in uncertain environments.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"134 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132268364","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributing 3D manufacturing simulations to realize the Digital Plant","authors":"E. Freund, D. Pensky","doi":"10.1109/ROBOT.2003.1241840","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241840","url":null,"abstract":"In the field of Computer Aided Manufacturing (CAM) solutions are required for the simulation of entire production environments including all hardware and software components inside an installation. In this regard, the so-called Digital Plant became more and more important. Several tools based on simulation techniques exist which support the developing of new production automation solutions and factories. In contrast to this, we introduce an innovative approach to realize the Digital Plant. Rest upon 3D graphical simulation we are able to process enlarged simulation models using distributed computing of the complex environment. The flexibility of the evolved interface between simulator's instances facilitates the coupling of other simulation tools to our system. Adapted from the identical interface we connect real controller systems to the partitions to simulate production control. The integration of the different software components extends the use of manufacturing simulation. In this article, we focus on distributing and simulating complex models to establish the base of the Digital Plant.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134620910","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An iterative framework for projection-based image sequence registration","authors":"Joaquin Salas","doi":"10.1109/ROBOT.2003.1241871","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241871","url":null,"abstract":"In this paper, an iterative framework for projection-based image sequence registration to be used by a non-holonomic mobile robot is introduced. Since obtaining complete registration has shown to be difficult and error prone, it is claimed that it is worth pursuing trying to gather at least partial and qualitative information from projections. A tracking method is adapted to select and track features between projections. Some results are shown with sequences of images taken when a mobile robot was heading forward, approximately along its optical axis, and rotating approximately around its optical center. It is shown that is possible to interpret camera motion from the projection of individual frames in an image stream.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133736771","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A new approach for structural credit assignment in distributed reinforcement learning systems","authors":"Zhong Yu, Gu Guo-chang, Zhang Rubo","doi":"10.1109/ROBOT.2003.1241758","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241758","url":null,"abstract":"Most existing algorithm for structural credit assignment are developed for competitive reinforcement learning systems. In competitive reinforcement learning system, agents are activated one by one, so there is only one active agent at a time and structural credit assignment could be implemented by some temporal credit assignment algorithms. In collaborated reinforcement learning systems, agents are activated simultaneously, so how to transform the global reinforcement signal fed back from the environment to a reinforcement vector is a crucial difficulty that could not be slide over. In this article, the first really feasible and efficient structural credit assignment difficulty in collaborated reinforcement learning systems is primarily solved. The experiments show that the algorithm converges very rapidly and the assignment result is quite satisfying.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115834314","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual odometry from an omnidirectional vision system","authors":"R. Bunschoten, B. Kröse","doi":"10.1109/ROBOT.2003.1241656","DOIUrl":"https://doi.org/10.1109/ROBOT.2003.1241656","url":null,"abstract":"We describe a method for estimating the translation and rotation between two subsequent poses of a moving robot from images taken with an omnidirectional vision system. This allows some form of visual odometry. We use two sorts of projections derived from the omnidirectional image. The rotation and translation direction are determined from panoramic projections. After that, a projection on a plane parallel to the ground is used to estimate the length of the translation vector. Experiments on real and simulated data are carried out.","PeriodicalId":315346,"journal":{"name":"2003 IEEE International Conference on Robotics and Automation (Cat. No.03CH37422)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2003-11-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124276138","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}