Yifan Zhu, Mei Hao, Xupeng Zhu, Quentin Bateux, Alex Wong, Aaron M. Dollar
{"title":"自由力:基于视觉的接触力估计与一个柔顺的手","authors":"Yifan Zhu, Mei Hao, Xupeng Zhu, Quentin Bateux, Alex Wong, Aaron M. Dollar","doi":"10.1126/scirobotics.adq5046","DOIUrl":null,"url":null,"abstract":"<div >Force-sensing capabilities are essential for robot manipulation systems. However, commonly used wrist-mounted force/torque sensors are heavy, fragile, and expensive, and tactile sensors require adding fragile circuitry to the robot fingers while only providing force information local to the contact. Here, we present a vision-based contact force estimator that serves as a more cost-effective and easier-to-implement alternative to existing force sensors by leveraging deformations of compliant hands upon contacts when compliant hands are in use. Our approach uses an estimator that visually observes a specialized compliant robot hand (available open source with easy fabrication through 3D printing) and predicts the contact force on the basis of its elastic deformation upon external forces. Because using wrist-mounted cameras to observe the gripper is common for robot manipulation systems, our method can obtain additional force information provided that the gripper is compliant. We optimized our compliant hand to minimize friction and avoid singularities in finger configurations, and we introduced memory to the estimator to combat the partial observability of the contact forces from the remaining friction and hysteresis. In addition, the estimator was made robust to background distractions and finger occlusions using vision foundation models to segment out the fingers. Although it is less accurate and slower than commercial force/torque sensors, we experimentally demonstrated the accuracy and robustness of our estimator (achieving between 0.2 newton and 0.4 newton error) and its utility during a variety of manipulation tasks using the gripper in the presence of noisy backgrounds and occlusions.</div>","PeriodicalId":56029,"journal":{"name":"Science Robotics","volume":"10 103","pages":""},"PeriodicalIF":27.5000,"publicationDate":"2025-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Forces for free: Vision-based contact force estimation with a compliant hand\",\"authors\":\"Yifan Zhu, Mei Hao, Xupeng Zhu, Quentin Bateux, Alex Wong, Aaron M. Dollar\",\"doi\":\"10.1126/scirobotics.adq5046\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div >Force-sensing capabilities are essential for robot manipulation systems. However, commonly used wrist-mounted force/torque sensors are heavy, fragile, and expensive, and tactile sensors require adding fragile circuitry to the robot fingers while only providing force information local to the contact. Here, we present a vision-based contact force estimator that serves as a more cost-effective and easier-to-implement alternative to existing force sensors by leveraging deformations of compliant hands upon contacts when compliant hands are in use. Our approach uses an estimator that visually observes a specialized compliant robot hand (available open source with easy fabrication through 3D printing) and predicts the contact force on the basis of its elastic deformation upon external forces. Because using wrist-mounted cameras to observe the gripper is common for robot manipulation systems, our method can obtain additional force information provided that the gripper is compliant. 
We optimized our compliant hand to minimize friction and avoid singularities in finger configurations, and we introduced memory to the estimator to combat the partial observability of the contact forces from the remaining friction and hysteresis. In addition, the estimator was made robust to background distractions and finger occlusions using vision foundation models to segment out the fingers. Although it is less accurate and slower than commercial force/torque sensors, we experimentally demonstrated the accuracy and robustness of our estimator (achieving between 0.2 newton and 0.4 newton error) and its utility during a variety of manipulation tasks using the gripper in the presence of noisy backgrounds and occlusions.</div>\",\"PeriodicalId\":56029,\"journal\":{\"name\":\"Science Robotics\",\"volume\":\"10 103\",\"pages\":\"\"},\"PeriodicalIF\":27.5000,\"publicationDate\":\"2025-06-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Science Robotics\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.science.org/doi/10.1126/scirobotics.adq5046\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ROBOTICS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Science Robotics","FirstCategoryId":"94","ListUrlMain":"https://www.science.org/doi/10.1126/scirobotics.adq5046","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ROBOTICS","Score":null,"Total":0}
Forces for free: Vision-based contact force estimation with a compliant hand
Force-sensing capabilities are essential for robot manipulation systems. However, commonly used wrist-mounted force/torque sensors are heavy, fragile, and expensive, and tactile sensors require adding fragile circuitry to the robot fingers while only providing force information local to the contact. Here, we present a vision-based contact force estimator that serves as a more cost-effective and easier-to-implement alternative to existing force sensors by leveraging the deformation of a compliant hand upon contact. Our approach uses an estimator that visually observes a specialized compliant robot hand (available open source and easily fabricated through 3D printing) and predicts the contact force from the hand's elastic deformation under external forces. Because wrist-mounted cameras that observe the gripper are already common in robot manipulation systems, our method can provide additional force information whenever the gripper is compliant. We optimized our compliant hand to minimize friction and avoid singularities in finger configurations, and we introduced memory to the estimator to combat the partial observability of contact forces caused by the remaining friction and hysteresis. In addition, the estimator was made robust to background distractions and finger occlusions by using vision foundation models to segment out the fingers. Although less accurate and slower than commercial force/torque sensors, our estimator was experimentally shown to be accurate and robust (achieving errors between 0.2 and 0.4 newtons) and useful during a variety of manipulation tasks using the gripper in the presence of noisy backgrounds and occlusions.
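A minimal sketch of the kind of learned estimator the abstract describes, not the authors' implementation: a small convolutional encoder processes wrist-camera frames of the compliant fingers (assumed to be pre-segmented from the background by a vision foundation model such as SAM, supplied here as an extra mask channel), and a recurrent layer provides the memory used to cope with the partial observability introduced by residual friction and hysteresis. All layer sizes, input shapes, and names are illustrative assumptions.

```python
# Hypothetical sketch of a vision-based contact force estimator with memory.
# Inputs are wrist-camera frames of the compliant fingers, already segmented
# by a vision foundation model (the mask is passed as a fourth channel).
import torch
import torch.nn as nn


class VisionForceEstimator(nn.Module):
    def __init__(self, hidden_dim: int = 128):
        super().__init__()
        # Small CNN encoder over the masked gripper image
        # (3 RGB channels + 1 segmentation-mask channel = 4 input channels).
        self.encoder = nn.Sequential(
            nn.Conv2d(4, 16, kernel_size=5, stride=2, padding=2), nn.ReLU(),
            nn.Conv2d(16, 32, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, kernel_size=3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
        )
        # Recurrent memory over the frame sequence, since a single frame only
        # partially observes the contact force under friction and hysteresis.
        self.gru = nn.GRU(input_size=64, hidden_size=hidden_dim, batch_first=True)
        # Regression head: 3-D contact force estimate (in newtons).
        self.head = nn.Linear(hidden_dim, 3)

    def forward(self, frames: torch.Tensor) -> torch.Tensor:
        # frames: (batch, time, 4, H, W) masked RGB frames of the fingers.
        b, t, c, h, w = frames.shape
        feats = self.encoder(frames.reshape(b * t, c, h, w)).reshape(b, t, -1)
        out, _ = self.gru(feats)
        return self.head(out)  # (batch, time, 3) per-frame force estimates


if __name__ == "__main__":
    model = VisionForceEstimator()
    clip = torch.randn(2, 8, 4, 96, 96)  # two dummy 8-frame clips
    forces = model(clip)
    print(forces.shape)  # torch.Size([2, 8, 3])
```

In this sketch, the recurrent state plays the role of the memory mentioned in the abstract: a single image of a deflected finger cannot always distinguish a finger held in place by friction from one under active load, whereas a short history of deformations can disambiguate the two.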
Journal Introduction:
Science Robotics publishes original, peer-reviewed, science- or engineering-based research articles that advance the field of robotics. The journal also features editor-commissioned Reviews. An international team of academic editors holds Science Robotics articles to the same high-quality standard that is the hallmark of the Science family of journals.
Sub-topics include: actuators, advanced materials, artificial intelligence, autonomous vehicles, bio-inspired design, exoskeletons, fabrication, field robotics, human-robot interaction, humanoids, industrial robotics, kinematics, machine learning, material science, medical technology, motion planning and control, micro- and nano-robotics, multi-robot control, sensors, service robotics, social and ethical issues, soft robotics, and space, planetary and undersea exploration.