Yiming Qiao , Chaofan Zhang , Shaowei Cui , Lu Cao , Zhigang Wang , Peng Wang , Shuo Wang
{"title":"机器人操作的神经形态视觉触觉滑动感知","authors":"Yiming Qiao , Chaofan Zhang , Shaowei Cui , Lu Cao , Zhigang Wang , Peng Wang , Shuo Wang","doi":"10.1016/j.robot.2025.105191","DOIUrl":null,"url":null,"abstract":"<div><div>Visuotactile sensing technology has received extensive attention in the tactile sensing community due to its stable high-resolution deformation sensing capabilities. However, the existing visuotactile sensing methods are far from humanoid neural information processing mechanism. To address this gap, we propose a neuromorphic visuotactile slip detection method named VT-SNN using Tactile Address-Event Representation (TAER) encoding combined with brain-inspired Spiking Neural Network (SNN) modeling in this paper. Our extensive experimental results demonstrate that the VT-SNN achieves slip detection accuracy of 99.59% and F1 score of 99.28%, which is comparable to Artificial Neural Networks (ANNs) while exhibiting significant advantages in power dissipation and inference time. Furthermore, we deployed the VT-SNN on Intel neuromorphic computing chip–Loihi and performed closed-loop slip-feedback robotic manipulation tasks such as bottle-cap tightening and loosening. 
Our closed-loop neuromorphic visuotactile sensing system shows significant promise for high accuracy, low latency, and low power dissipation for robotic dexterous manipulation.</div></div>","PeriodicalId":49592,"journal":{"name":"Robotics and Autonomous Systems","volume":"195 ","pages":"Article 105191"},"PeriodicalIF":5.2000,"publicationDate":"2025-09-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Neuromorphic visuotactile slip perception for robotic manipulation\",\"authors\":\"Yiming Qiao , Chaofan Zhang , Shaowei Cui , Lu Cao , Zhigang Wang , Peng Wang , Shuo Wang\",\"doi\":\"10.1016/j.robot.2025.105191\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Visuotactile sensing technology has received extensive attention in the tactile sensing community due to its stable high-resolution deformation sensing capabilities. However, the existing visuotactile sensing methods are far from humanoid neural information processing mechanism. To address this gap, we propose a neuromorphic visuotactile slip detection method named VT-SNN using Tactile Address-Event Representation (TAER) encoding combined with brain-inspired Spiking Neural Network (SNN) modeling in this paper. Our extensive experimental results demonstrate that the VT-SNN achieves slip detection accuracy of 99.59% and F1 score of 99.28%, which is comparable to Artificial Neural Networks (ANNs) while exhibiting significant advantages in power dissipation and inference time. Furthermore, we deployed the VT-SNN on Intel neuromorphic computing chip–Loihi and performed closed-loop slip-feedback robotic manipulation tasks such as bottle-cap tightening and loosening. 
Our closed-loop neuromorphic visuotactile sensing system shows significant promise for high accuracy, low latency, and low power dissipation for robotic dexterous manipulation.</div></div>\",\"PeriodicalId\":49592,\"journal\":{\"name\":\"Robotics and Autonomous Systems\",\"volume\":\"195 \",\"pages\":\"Article 105191\"},\"PeriodicalIF\":5.2000,\"publicationDate\":\"2025-09-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Robotics and Autonomous Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S092188902500288X\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AUTOMATION & CONTROL SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Robotics and Autonomous Systems","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S092188902500288X","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AUTOMATION & CONTROL SYSTEMS","Score":null,"Total":0}
Neuromorphic visuotactile slip perception for robotic manipulation
Visuotactile sensing technology has received extensive attention in the tactile sensing community due to its stable, high-resolution deformation sensing capabilities. However, existing visuotactile sensing methods are far removed from the neural information processing mechanisms of the human brain. To address this gap, we propose VT-SNN, a neuromorphic visuotactile slip detection method that combines Tactile Address-Event Representation (TAER) encoding with brain-inspired Spiking Neural Network (SNN) modeling. Extensive experimental results demonstrate that VT-SNN achieves a slip detection accuracy of 99.59% and an F1 score of 99.28%, comparable to Artificial Neural Networks (ANNs), while offering significant advantages in power dissipation and inference time. Furthermore, we deployed VT-SNN on Intel's Loihi neuromorphic computing chip and performed closed-loop slip-feedback robotic manipulation tasks such as bottle-cap tightening and loosening. Our closed-loop neuromorphic visuotactile sensing system shows significant promise for high-accuracy, low-latency, and low-power robotic dexterous manipulation.
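The abstract does not detail the TAER encoding or the SNN architecture, but the general pipeline it describes can be illustrated in miniature: tactile frames are converted into sparse address-events by thresholding per-taxel change, and the resulting spike stream drives a spiking neuron. The sketch below is an illustrative assumption, not the authors' implementation; the thresholding rule, the `taer_encode` and `lif_neuron` names, and all parameter values are hypothetical simplifications of the paper's method.

```python
import numpy as np

def taer_encode(frames, threshold=0.1):
    """Convert a (T, H, W) stack of tactile intensity frames into
    address-event tuples (t, y, x, polarity). A spike fires wherever
    the change between consecutive frames exceeds `threshold`; the
    polarity records the sign of the change."""
    frames = np.asarray(frames, dtype=float)
    events = []
    for t in range(1, frames.shape[0]):
        diff = frames[t] - frames[t - 1]
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            events.append((t, int(y), int(x), 1 if diff[y, x] > 0 else -1))
    return events

def lif_neuron(spike_counts, decay=0.8, v_thresh=1.5):
    """Leaky integrate-and-fire neuron driven by per-step spike counts:
    the membrane potential decays each step, accumulates input, and
    emits an output spike (then resets) when it crosses v_thresh."""
    v, out = 0.0, []
    for s in spike_counts:
        v = decay * v + s
        if v >= v_thresh:
            out.append(1)
            v = 0.0
        else:
            out.append(0)
    return out

# Two 2x2 tactile frames; only the top-left taxel changes enough to spike.
frames = [[[0.0, 0.0], [0.0, 0.0]],
          [[0.5, 0.05], [0.0, 0.0]]]
print(taer_encode(frames))          # [(1, 0, 0, 1)]
print(lif_neuron([1, 1, 0, 0]))     # [0, 1, 0, 0]
```

In a real VT-SNN-style system, the event stream would feed a trained network of such neurons on neuromorphic hardware, which is what enables the low-power, low-latency inference reported above.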
Journal introduction:
Robotics and Autonomous Systems will carry articles describing fundamental developments in the field of robotics, with special emphasis on autonomous systems. An important goal of this journal is to extend the state of the art in both symbolic and sensory-based robot control and learning in the context of autonomous systems. The journal will also carry articles on the theoretical, computational, and experimental aspects of autonomous systems, or of modules of such systems.