{"title":"ViTract: Robust Object Shape Perception via Active Visuo-Tactile Interaction","authors":"Anirvan Dutta;Etienne Burdet;Mohsen Kaboli","doi":"10.1109/LRA.2024.3483037","DOIUrl":null,"url":null,"abstract":"An essential problem in robotic systems that are to be deployed in unstructured environments is the accurate and autonomous perception of the shapes of previously unseen objects. Existing methods for shape estimation or reconstruction have leveraged either visual or tactile interactive exploration techniques or have relied on comprehensive visual or tactile information acquired in an offline manner. In this letter, a novel visuo-tactile interactive perception framework- ViTract, is introduced for shape estimation of unseen objects. Our framework estimates the shape of diverse objects robustly using low-dimensional, efficient, and generalizable shape primitives, which are superquadrics. The probabilistic formulation within our framework takes advantage of the complementary information provided by vision and tactile observations while accounting for associated noise. As part of our framework, we propose a novel modality-specific information gain to select the most informative and reliable exploratory action (using vision/tactile) to obtain iterative visuo/tactile information. 
Our real-robot experiments demonstrate superior and robust performance compared to state-of-the-art visuo-tactile-based shape estimation techniques.","PeriodicalId":13241,"journal":{"name":"IEEE Robotics and Automation Letters","volume":"9 12","pages":"11250-11257"},"PeriodicalIF":4.6000,"publicationDate":"2024-10-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10720798","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Robotics and Automation Letters","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10720798/","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ROBOTICS","Score":null,"Total":0}
Citations: 0
Abstract
An essential problem for robotic systems deployed in unstructured environments is the accurate and autonomous perception of the shapes of previously unseen objects. Existing methods for shape estimation or reconstruction have leveraged either visual or tactile interactive exploration, or have relied on comprehensive visual or tactile information acquired offline. In this letter, a novel visuo-tactile interactive perception framework, ViTract, is introduced for shape estimation of unseen objects. The framework robustly estimates the shapes of diverse objects using low-dimensional, efficient, and generalizable shape primitives: superquadrics. Its probabilistic formulation exploits the complementary information provided by visual and tactile observations while accounting for the associated noise. As part of the framework, a novel modality-specific information gain is proposed to select the most informative and reliable exploratory action (visual or tactile) at each iteration. Real-robot experiments demonstrate superior and robust performance compared to state-of-the-art visuo-tactile shape estimation techniques.
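For readers unfamiliar with the shape primitives the abstract refers to, superquadrics are commonly represented by an implicit inside-outside function of five shape parameters (three semi-axis lengths and two shape exponents). The sketch below is a generic illustration of that standard function, not the paper's implementation; all parameter names are assumptions.

```python
def superquadric_f(p, a=(1.0, 1.0, 1.0), eps=(1.0, 1.0)):
    """Standard superquadric inside-outside function.

    p   : point (x, y, z) in the superquadric's canonical frame
    a   : semi-axis lengths (a1, a2, a3)       -- illustrative names
    eps : shape exponents (eps1, eps2)         -- eps = (1, 1) gives an ellipsoid

    Returns F(p): F < 1 inside the surface, F = 1 on it, F > 1 outside.
    """
    x, y, z = p
    a1, a2, a3 = a
    e1, e2 = eps
    # Combine the x/y terms first, then blend with z via the second exponent.
    xy = (abs(x / a1) ** (2.0 / e2) + abs(y / a2) ** (2.0 / e2)) ** (e2 / e1)
    return xy + abs(z / a3) ** (2.0 / e1)


# With default parameters the primitive is a unit sphere, so membership
# reduces to distance from the origin.
print(superquadric_f((1.0, 0.0, 0.0)))  # → 1.0  (on the surface)
print(superquadric_f((0.5, 0.0, 0.0)))  # → 0.25 (inside)
```

Varying `eps` continuously morphs the primitive between box-like, ellipsoidal, and pinched shapes, which is what makes a handful of parameters sufficient to cover diverse object geometries.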
Journal Description
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.