{"title":"Multirobot motion coordination using a deliberative approach","authors":"C. Ferrari, Enrico Pagello, M. Voltolina, Jun Ota, T. Arai","doi":"10.1109/EURBOT.1997.633595","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633595","url":null,"abstract":"The paper investigates the practical use of the motion plan quality and of the motion plan robustness measures coupled with a deliberative scheduling approach, for computing safe motions for multiple mobile robots. The use of any-time algorithms allows one to evaluate the opportunity of looking for alternative solution paths by generating small variations of robot motions in space and in time. By using the concept of plan robustness, we generate several alternative paths that are evaluated through various performance indices and impact factors, by some heuristic rules. These indices allow one to know how much a variation affects a given plan. Finally, we outline some recent experiments.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"233 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115807542","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Active mobile robot localization by entropy minimization","authors":"Wolfram Burgard, D. Fox, S. Thrun","doi":"10.1109/EURBOT.1997.633623","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633623","url":null,"abstract":"Localization is the problem of determining the position of a mobile robot from sensor data. Most existing localization approaches are passive, i.e., they do not exploit the opportunity to control the robot's effecters during localization. This paper proposes an active localization approach. The approach provides rational criteria for (1) setting the robot's motion direction (exploration), and (2) determining the pointing direction of the sensors so as to most efficiently localize the robot. Furthermore, it is able to deal with noisy sensors and approximate world models. The appropriateness of our approach is demonstrated empirically using a mobile robot in a structured office environment.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121520232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spatial learning with perceptually grounded representations","authors":"C. Balkenius","doi":"10.1109/EURBOT.1997.633549","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633549","url":null,"abstract":"The goal of this paper is to develop the foundation for a spatial navigation without objective representations. Rather than building the spatial representations on a Euclidean space, a weaker conception of space is used which has a closer connection to perception. A type of spatial representation is described that uses perceptual information directly to define regions in space. By combining such regions, it is possible to derive a number of useful spatial representations such as place-fields, paths and topological maps. Compared to other methods, the representations of the presented approach have the advantage that they are always grounded in the perceptual abilities of the robot, and thus, more likely to function correctly.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"205 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133171178","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Navigation and guidance of an intelligent mobile robot","authors":"H. Hu, D. Gu, M. Brady","doi":"10.1109/EURBOT.1997.633597","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633597","url":null,"abstract":"This paper presents an experimental mobile robot designed to operate autonomously within both indoor and outdoor environments. A sensor-based autonomous navigation architecture for a dynamically changing environment is described. Emphasis is placed on two important issues: autonomous navigation and smooth guidance of the robot. Several trajectory models are adopted to generate continuous-curvature paths in order to cope with the nonholonomic constraint of the robot and unexpected obstacles. A smooth guidance algorithm has been used to track the planned path. Experimental results demonstrate its performance.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115278634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Active acceleration compensation using a Stewart-platform on a mobile robot","authors":"R. Graf, R. Dillmann","doi":"10.1109/EURBOT.1997.633569","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633569","url":null,"abstract":"If an object is transported on a mobile platform, all accelerations of the mobile platform affect the object. This is of course undesirable, since accelerations can move or even damage the object. Stewart-platforms are mostly used for simulation, where the platform generates accelerations that increase the simulation's quality. Vice versa, it's possible to use a Stewart-platform mounted on a mobile platform to compensate the unwanted accelerations by generating a tilt. The necessary movement of the platform is calculated by a washout-filter. Applications of this combination include the transport of liquids in open boxes and medical transports, where the patients must not be affected by any acceleration.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124745973","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mobile robot navigation using recursive motion control","authors":"R. Ellepola, P. Kovesi","doi":"10.1109/EURBOT.1997.633626","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633626","url":null,"abstract":"As a mobile robot travels towards its goal, sensor input is used to modify the path of the robot to prevent collision with obstacles. In conventional programming, when an obstacle is detected by the sensors, control loops are escaped according to the requirements of the executing navigation task. Thus navigational routines have to be written in a loop form with the necessary exit conditions built in. This paper presents an alternative approach, recursive motion control. Recovery paths around obstacles are planned and executed by recursively invoking the motion control software. The approach allows multiple obstacles to be handled in a consistent manner. On completion of the obstacle avoiding sub-path(s) the level of recursion is dropped to allow resumption of the original motion.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115239093","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inducing topological maps from task-oriented perception and exploratory behaviors","authors":"G. Beccari, S. Caselli, F. Zanichelli, D. Diemmi","doi":"10.1109/EURBOT.1997.633619","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633619","url":null,"abstract":"We describe motor and perceptual behaviors that have proven useful for indoor navigation of an autonomous mobile robot. These behaviors take advantage of the large amount of structure that characterizes many indoor, office-like environments. Based on pre-existing structural landmarks, a mobile robot has the ability to explore, map, and navigate one among several office buildings sharing similar structural features, while coping with slow environment variations and local dynamics. The mobile robot develops and maintains an internal representation of the environment in terms of a topological and qualitative map. The types of structural features suitable as navigation landmarks largely depend upon the available robot sensoriality. Adequate navigation performance is achieved by subdividing perception and navigation into a number of behaviors layered upon a multi-threaded real-time control architecture.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"438 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116705453","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extension of the ALVINN-architecture for robust visual guidance of a miniature robot","authors":"M. Krabbes, H.-J. Bohme, V. Stephan, H. Groß","doi":"10.1109/EURBOT.1997.633545","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633545","url":null,"abstract":"Extensions of the ALVINN architecture are introduced for a KHEPERA miniature robot to navigate visually robust in a labyrinth. The reimplementation of the ALVINN-approach demonstrates, that also in indoor-environments a complex visual robot navigation is achievable using a direct input-output-mapping with a multilayer perceptron network, which is trained by expert-cloning. With the extensions it succeeds to overcome the restrictions of the small visual field of the camera by completing the input vector with history-components, introduction of the velocity dimension and evaluation of the network's output by a dynamic neural field. This creates the prerequisites to take turns which are no longer visible in the actual image and so make use of several alternatives of actions.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"73 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121419060","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Non-supervised chromatic illuminant: corrector for autonomous robots","authors":"D. Marini, A. Rizzi","doi":"10.1109/EURBOT.1997.633611","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633611","url":null,"abstract":"One of the well-known problems in colour image interpretation is the colour-constancy problem. Autonomous robots that use colour information to select objects or landmarks can be deceived in presence of heavy coloured illuminants. Classic chromatic filtering presupposes detailed information about light source characteristics, but this is not always possible. The presence of emergency lights or different kinds of light sources can heavily influence object colour. Retinex theory, by Land and McCann (1971), can resolve these problems. This theory gives color perception on a color space based on three brightness computed as relative reflectance along multiple exploration paths of the perceived scene. This paper considers the application of this theory in order to allow automatic colour detection in autonomous robots. The algorithm has been tested on simple coloured scenes illuminated with different light sources. The results obtained are compared.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"246 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115230082","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual behaviours for binocular tracking","authors":"A. Bernardino, J. Santos-Victor","doi":"10.1109/EURBOT.1997.633543","DOIUrl":"https://doi.org/10.1109/EURBOT.1997.633543","url":null,"abstract":"This paper presents a binocular tracking system based on the integration of visual behaviours. Biologically motivated behaviours, vergence and pursuit, cooperate as parallel, complementary and highly coupled processes in the tracking system, simplifying the acquisition of perceptual information and system modeling and control. The use of a space variant image representation and low-level visual cues as feedback signals in a closed loop control architecture, allow real-time and reliable performance for each behaviour, despite the low precision of the algorithms and modeling errors. The behaviours are integrated and the overall system is implemented in a stereo head running at real-time (12.5 Hz), without any specific processing hardware. Results are presented for objects of different shapes and motions, illustrating that tracking can be robustly achieved by the cooperation of purposively designed behaviours, tuned to specific subgoals.","PeriodicalId":129683,"journal":{"name":"Proceedings Second EUROMICRO Workshop on Advanced Mobile Robots","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1997-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129784583","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}