{"title":"Sensing strategy and recognizing attractor","authors":"C. Yuhui, T. Ikegami, K. Takamasu, S. Ozono","doi":"10.1109/MFI.1994.398461","DOIUrl":"https://doi.org/10.1109/MFI.1994.398461","url":null,"abstract":"The representation of intelligence is analyzed for intelligent behaviours, and its shortcomings are shown. It is proposed that intelligent sensing behaviour in active sensing is necessary to realize a corresponding ethological behaviour in a dynamic environment. As part of a general intelligent measurement methodology, the process of evolving a recognizing attractor is simulated. The behaviour evolution and convergence are described by a logistic dynamic model.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124385359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"One-dimensional sequential sensor fusion","authors":"R. Tanner","doi":"10.1109/MFI.1994.398472","DOIUrl":"https://doi.org/10.1109/MFI.1994.398472","url":null,"abstract":"This paper develops a theoretical basis for fusing sensor data in the temporal domain. Examples are used to illustrate the concepts.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"322 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124557009","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Signature search method for 3-D pose refinement with range data","authors":"N. Burtnyk, M. Greenspan","doi":"10.1109/MFI.1994.398438","DOIUrl":"https://doi.org/10.1109/MFI.1994.398438","url":null,"abstract":"In many applications in robotics, the geometry of the task environment is uncertain and so the pose of the target object may be known only approximately. For the object to be grasped successfully its actual pose must be determined using some form of vision sensing. This paper presents a novel method of processing 3-D range data for pose refinement. Given the model of the object and its approximate pose, the algorithm adjusts the pose of the object model for a best fit to the measured 3-D data. The main attributes of this algorithm are that its performance is largely unaffected by background clutter including some tolerance to occlusion and that it imposes no restrictions on the surface shape or representation scheme used for the model.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"93 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134015224","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Velocity based vestibular-visual integration in active sensing system","authors":"T. Yamaguchi, H. Yamasaki","doi":"10.1109/MFI.1994.398394","DOIUrl":"https://doi.org/10.1109/MFI.1994.398394","url":null,"abstract":"Most advanced visual sensing systems must realize visual stabilization and target gazing prior to the recognition process when operating in a structurally unconstrained environment. In this paper, a new method of acquiring this fundamental ability by a sensor fusion technique is proposed. It is important to choose an appropriate basis for sensor fusion, and it is proposed to integrate the visual system with angular velocity sensors and a gaze control system through one of the constraint conditions that govern the relation between measured and controlled variables. It is also proposed that such a visual sensing system be realized by integrating the subsystems through their (angular) velocity information. Because the velocity field in the image can now be calculated in real time and the angular velocity can be handled directly by both rate gyroscopes and gaze control motors, such an integration is expected to suit a visual system that requires quick response. Experimental results show that the proposed velocity-based visual stabilization is feasible and performs better than stabilization by a single sense alone. The reason for this improvement is also discussed theoretically.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131752735","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Propagation mode estimation: a prerequisite for OTH radar fusion","authors":"T. Kurien, P. Milanfar, D. Logan, W. P. Berry","doi":"10.1109/MFI.1994.398406","DOIUrl":"https://doi.org/10.1109/MFI.1994.398406","url":null,"abstract":"Over-the-horizon (OTH) radars operate in the high frequency (HF) band of the radar spectrum and utilize ionospheric reflections to detect and track targets in regions up to 2000 nautical miles from the radar site. OTH radars thus have the capability to serve as cost-effective sensors for detecting and tracking aircraft over large surveillance areas. In this paper we describe an algorithm for fusing data from multiple OTH radars. We show that estimation of the radar signal propagation mode is tightly coupled to the tracking and correlation functions in the fusion algorithm. Results using test data collected from two operational OTH radars show that fusion improves both track continuity and track accuracy.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132191709","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Distributed panoramic sensing in multiagent robotics","authors":"M. J. Barth, H. Ishiguro","doi":"10.1109/MFI.1994.398381","DOIUrl":"https://doi.org/10.1109/MFI.1994.398381","url":null,"abstract":"In the field of multiagent robotics, much emphasis has been placed on explicit communication strategies between robots, and little on sensing. Sensing is often used on individual robots for avoiding obstacles; however, the authors believe its use at a higher level can be advantageous for multiagent cooperation. The authors introduce a concept of cooperation by observation that features the use of panoramic vision sensing applied to multiagent robotics. Using panoramic vision techniques, a robot can view 360° around itself and also obtain coarse range information about objects in its environment. By also observing other robots in the robot society, each robot can localize itself within the group. Thus each robot can observe its local area and create a local map. These distributed local maps, centered around each robot, can then be integrated into a larger global map based on the relative localization information between robots. Further, by sensing the positions of other robots and objects, a set of simple behaviors can be used to explore the environment effectively. Each robot calculates the range uncertainty to objects while viewing, and then moves to minimize that uncertainty. Multiagent exploration based on these behaviors has been shown to be effective through preliminary experimentation.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"389 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133354283","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Motion planning for multiple obstacles avoidance of autonomous mobile robot using hierarchical fuzzy rules","authors":"T. Aoki, M. Matsuno, T. Suzuki, S. Okuma","doi":"10.1109/MFI.1994.398454","DOIUrl":"https://doi.org/10.1109/MFI.1994.398454","url":null,"abstract":"This paper presents motion planning for an autonomous mobile robot moving toward a goal while avoiding multiple moving obstacles. The autonomous robot controls itself with both velocity and steering commands decided by a fuzzy algorithm. The algorithm has a hierarchical structure consisting of three levels of fuzzy logic modules. In the lower level of the algorithm, the velocity and steering control inputs are decided independently. In the middle level, a module called the fuzzy balancer adjusts and combines these inputs so that they do not conflict with each other. In the upper level, the control input toward the goal and the control inputs for the multiple obstacles are combined to achieve the desired motion. In this paper, we propose the algorithm of hierarchical fuzzy rules and show simulation results.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129053456","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MAVI: a multi-agent system for visual integration","authors":"O. Boissier, Y. Demazeau","doi":"10.1109/MFI.1994.398382","DOIUrl":"https://doi.org/10.1109/MFI.1994.398382","url":null,"abstract":"This paper presents a control architecture for building integrated vision systems. This system is based on a multi-agent approach which allows the definition of an open and flexible architecture for the integration of visual modules. It is shown how this architecture permits the dynamic definition of the control with some visual goals to satisfy in the context of an active vision system.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115788917","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognition of 3D objects by a 3-fingered robot hand equipped with tactile and force sensors","authors":"H. Hahn","doi":"10.1109/MFI.1994.398440","DOIUrl":"https://doi.org/10.1109/MFI.1994.398440","url":null,"abstract":"This paper presents an algorithm that recognizes and localizes 3D objects using a 3-fingered robot hand, where an optical tactile sensor and a force sensor are mounted on each finger. Both sensors are capable of measuring the position and normal vector of the test object at the contact point. For efficient matching, the objects are represented by a distribution graph of surface description vectors and a hierarchical table. The measurements of a position and an orientation are described by a possibility sphere and a possibility cone, respectively, whose sizes represent the error characteristics of the sensors. The matching object models are selected by fusing sensory data based on these cones and spheres. When multiple matching object models exist, the next sensing pose is selected in the multiple interpretation image so that the next sensing operation can discriminate as many remaining object models as possible. The use of hierarchical tables and possibility cones simplifies the matching and the determination of the next sensing pose.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129264655","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dynamic tactile information sensing for dextrous fingers utilizing displacement parameters of a suspension shell","authors":"T. Okada, Kinya Inamura, T. Suzuki","doi":"10.1109/MFI.1994.398447","DOIUrl":"https://doi.org/10.1109/MFI.1994.398447","url":null,"abstract":"This paper describes an optical displacement measurement method for developing tactile sensors based on a suspension-shell mechanism for dextrous fingers. To express the position and orientation of the suspension shell, three angular displacement parameters and two linear displacement parameters are introduced. In addition, a method for measuring these five parameters using inverse optical projection is proposed. Only the longitudinal force of the springs suspending the shell coaxially around the rigid finger body is considered; thus, the dynamic analysis of balance becomes very simple. Algebraic formulations for estimating the magnitude, direction and position of the external force are shown. Simulations verifying the proposed procedures are also presented.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126197541","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}