{"title":"Use of tactile sensors in enhancing the efficiency of vision-based object localization","authors":"M. Boshra, Hong Zhang","doi":"10.1109/MFI.1994.398446","DOIUrl":null,"url":null,"abstract":"We present a technique to localize polyhedral objects by integrating visual and tactile data. This technique is useful in tasks such as localizing an object in a robot hand. It is assumed that visual data are provided by a monocular visual sensor, while tactile data by a planar-array tactile sensor in contact with the object. Visual data are used to generate a set of hypotheses about the 3D object's pose, while tactile data to assist in verifying the visually-generated pose hypotheses. We specifically focus on using tactile data in hypothesis verification. A set of indexed bounds on the object's six transformation parameters are constructed from the tactile data. These indexed bounds are constructed off-line by expressing them with respect to a tactile-array frame. At run-time, each visually-generated hypothesis is efficiently compared with the touch-based bounds to determine whether to eliminate the hypothesis, or to consider it for further verification. The proposed technique is tested using simulated and real data.<<ETX>>","PeriodicalId":133630,"journal":{"name":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","volume":"29 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-10-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 IEEE International Conference on MFI '94. Multisensor Fusion and Integration for Intelligent Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MFI.1994.398446","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
We present a technique to localize polyhedral objects by integrating visual and tactile data. This technique is useful in tasks such as localizing an object in a robot hand. It is assumed that visual data are provided by a monocular visual sensor, while tactile data are provided by a planar-array tactile sensor in contact with the object. Visual data are used to generate a set of hypotheses about the object's 3D pose, while tactile data are used to assist in verifying the visually generated pose hypotheses. We specifically focus on using tactile data in hypothesis verification. A set of indexed bounds on the object's six transformation parameters is constructed from the tactile data. These indexed bounds are constructed off-line by expressing them with respect to a tactile-array frame. At run-time, each visually generated hypothesis is efficiently compared with the touch-based bounds to determine whether to eliminate the hypothesis or to consider it for further verification. The proposed technique is tested using simulated and real data.
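To make the run-time pruning step concrete, the following is a minimal sketch of how a visually generated pose hypothesis might be tested against precomputed, touch-based interval bounds on the six transformation parameters. All names, data layouts, and the interval-test formulation here are illustrative assumptions, not the paper's actual implementation.

```python
from dataclasses import dataclass

@dataclass
class PoseHypothesis:
    # Six transformation parameters: three rotations followed by
    # three translations, (rx, ry, rz, tx, ty, tz). Hypothetical layout.
    params: tuple

# Hypothetical indexed-bounds structure: for each candidate contact
# face of the polyhedron, one (lower, upper) interval per transformation
# parameter, expressed in the tactile-array frame and built off-line.
Bounds = dict  # face_id -> list of six (lo, hi) pairs

def consistent_with_touch(h: PoseHypothesis, bounds: Bounds, face_id: int) -> bool:
    """Return True if the hypothesis survives the touch-based test
    and should be kept for further verification."""
    intervals = bounds[face_id]
    return all(lo <= p <= hi for p, (lo, hi) in zip(h.params, intervals))

def prune(hypotheses, bounds, face_id):
    """Eliminate hypotheses inconsistent with the tactile bounds."""
    return [h for h in hypotheses if consistent_with_touch(h, bounds, face_id)]
```

Because each test is a handful of interval comparisons against a precomputed table, the touch-based check can cheaply reject most inconsistent hypotheses before any more expensive verification is attempted, which is the efficiency gain the title refers to.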