Evetac Meets Sparse Probabilistic Spiking Neural Network: Enhancing Snap-Fit Recognition Efficiency and Performance

Senlin Fang; Haoran Ding; Yangjun Liu; Jiashu Liu; Yupo Zhang; Yilin Li; Hoiio Kong; Zhengkun Yi

IEEE Robotics and Automation Letters, vol. 10, no. 6, pp. 5353-5360, 2025. DOI: 10.1109/LRA.2025.3557744
Citations: 0
Abstract
Snap-fit peg-in-hole assembly is common in industrial robotics, particularly for 3C electronics, where fast and accurate tactile recognition is crucial for protecting fragile components. Event-based optical sensors, such as Evetac, are well suited to this task because of their high sparsity and their sensitivity to small, rapid force changes. However, existing research often converts event data into dense images and processes them with dense methods, leading to higher computational complexity. In this letter, we propose a Sparse Probabilistic Spiking Neural Network (SPSNN) that uses sparse convolutions to extract features from the event data, avoiding computation on zero-valued (inactive) cells. We introduce the Forward and Backward Propagation Through Probability (FBPTP) method, which enables simultaneous gradient computation across all time steps and eliminates the step-by-step traversal required by traditional Forward and Backward Propagation Through Time (FBPTT). Additionally, the Temporal Weight Prediction (TWP) method dynamically allocates weights to the outputs at different time steps, enhancing recognition performance with minimal impact on model efficiency. We compactly integrate the Evetac sensor into our robotic system and collect two datasets, Tactile Event Ethernet (TacEve-Eth) and Tactile Event Type-C (TacEve-TC), corresponding to cantilever and annular snap-fit structures. Experiments show that the SPSNN achieves a superior trade-off between recognition performance and efficiency compared to other widely used methods: it attains the highest average recognition performance while reducing inference time by more than 90% relative to FBPTT-based dense SNN baselines.
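To make the efficiency argument concrete, the short NumPy sketch below illustrates two ideas the abstract mentions: why convolving only at active event cells is far cheaper than first densifying the event stream, and how per-time-step outputs can be fused with learned temporal weights. This is a minimal illustrative sketch, not the authors' SPSNN, FBPTP, or TWP implementation; the grid size, sparsity level, 3x3 kernel, and softmax temporal weighting are all assumptions made for illustration.

```python
# Illustrative sketch only (NumPy). Not the paper's implementation:
# shapes, sparsity, kernel, and the softmax temporal weighting are assumptions.
import numpy as np

rng = np.random.default_rng(0)

H, W, T = 64, 64, 10    # hypothetical sensor grid and number of time bins
sparsity = 0.02         # assume ~2% of cells register an event per bin

# Synthetic event stream: for each time bin, a set of active (y, x) cells.
events = [
    set(map(tuple, rng.integers(0, [H, W], size=(int(sparsity * H * W), 2))))
    for _ in range(T)
]

kernel = rng.standard_normal((3, 3))   # a single illustrative 3x3 filter

def sparse_conv_cost(active_cells):
    """Multiply-accumulates if the filter is evaluated only at active cells."""
    return len(active_cells) * kernel.size

def dense_conv_cost():
    """Multiply-accumulates for a dense convolution over the full grid."""
    return H * W * kernel.size

sparse_macs = sum(sparse_conv_cost(e) for e in events)
dense_macs = T * dense_conv_cost()
print(f"dense MACs:  {dense_macs:,}")
print(f"sparse MACs: {sparse_macs:,}  (~{100 * sparse_macs / dense_macs:.1f}% of dense)")

# Temporal fusion: suppose the network emits class scores at every time bin.
# One simple (assumed) scheme is a softmax over per-step scalars, so more
# informative bins can dominate the final prediction.
num_classes = 4
per_step_scores = rng.standard_normal((T, num_classes))   # stand-in outputs
temporal_logits = np.linspace(-1.0, 1.0, T)               # stand-in learned weights
w = np.exp(temporal_logits) / np.exp(temporal_logits).sum()
fused = (w[:, None] * per_step_scores).sum(axis=0)
print("fused class scores:", np.round(fused, 3), "-> predicted class", fused.argmax())
```

With the assumed 2% sparsity, the event-driven cost is roughly two orders of magnitude below the dense cost, which is the kind of gap that motivates processing event data sparsely rather than converting it to dense images.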
Journal description:
The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.