Evetac Meets Sparse Probabilistic Spiking Neural Network: Enhancing Snap-Fit Recognition Efficiency and Performance

Impact Factor 4.6 · CAS Tier 2 (Computer Science) · JCR Q2 (Robotics)
Senlin Fang;Haoran Ding;Yangjun Liu;Jiashu Liu;Yupo Zhang;Yilin Li;Hoiio Kong;Zhengkun Yi
DOI: 10.1109/LRA.2025.3557744
Journal: IEEE Robotics and Automation Letters, vol. 10, no. 6, pp. 5353-5360
Published: 2025-04-03
Citations: 0

Abstract

Snap-fit peg-in-hole assembly is common in industrial robotics, particularly for 3C electronics, where fast and accurate tactile recognition is crucial for protecting fragile components. Event-based optical sensors, such as Evetac, are well suited for this task due to their high sparsity and sensitivity in detecting small, rapid force changes. However, existing research often converts event data into dense images and processes them with dense methods, leading to higher computational complexity. In this letter, we propose a Sparse Probabilistic Spiking Neural Network (SPSNN) that utilizes sparse convolutions to extract features from the event data, avoiding computation on inactive (zero) cells. We introduce the Forward and Backward Propagation Through Probability (FBPTP) method, which enables simultaneous gradient computation across all time steps, eliminating the step-by-step traversal required by traditional Forward and Backward Propagation Through Time (FBPTT). Additionally, the Temporal Weight Prediction (TWP) method dynamically allocates weights to the outputs at different time steps, enhancing recognition performance with minimal impact on model efficiency. We integrated the Evetac sensor compactly into our robotic system and collected two datasets, named Tactile Event Ethernet (TacEve-Eth) and Tactile Event Type-C (TacEve-TC), corresponding to cantilever and annular snap-fit structures. Experiments show that the SPSNN achieves a superior trade-off between recognition performance and efficiency compared to other widely used methods, attaining the highest average recognition performance while reducing inference time by over 90% compared to FBPTT-based dense SNN baselines.
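The efficiency argument in the abstract rests on two ideas: sparse convolutions touch only the active event sites rather than the full grid, and per-time-step outputs are combined with learned temporal weights rather than treated equally. The sketch below illustrates both ideas in plain NumPy on toy data; the grid size, sparsity level, filter, and softmax weighting are illustrative assumptions, not the paper's actual SPSNN, FBPTP, or TWP implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy event frames: T time steps of a 32x32 tactile grid with ~2% active cells,
# mimicking the sparsity of event-based data (illustrative, not Evetac's format).
T, H, W = 8, 32, 32
frames = (rng.random((T, H, W)) < 0.02).astype(np.float32)

def dense_sum3x3(frame):
    """Dense 3x3 box filter: visits every cell regardless of activity."""
    out = np.zeros_like(frame)
    padded = np.pad(frame, 1)
    for y in range(frame.shape[0]):
        for x in range(frame.shape[1]):
            out[y, x] = padded[y:y + 3, x:x + 3].sum()
    return out

def sparse_sum3x3(frame):
    """Same filter, but scatters contributions only from active cells,
    the way a sparse convolution skips inactive (zero) sites."""
    padded = np.pad(np.zeros_like(frame), 1)
    ys, xs = np.nonzero(frame)          # only active event locations
    for y, x in zip(ys, xs):
        padded[y:y + 3, x:x + 3] += frame[y, x]
    return padded[1:-1, 1:-1]

# Both paths agree, but the sparse path iterates over ~2% of the grid.
dense = np.stack([dense_sum3x3(f) for f in frames])
sparse = np.stack([sparse_sum3x3(f) for f in frames])
assert np.allclose(dense, sparse)

# Temporal weighting in the spirit of TWP: combine per-time-step class scores
# with softmax weights instead of equal averaging (scores are hypothetical).
logits = np.array([0.2, 0.1, 0.4, 0.9, 1.5, 1.2, 0.8, 0.5])
w = np.exp(logits) / np.exp(logits).sum()
weighted_score = float((w * logits).sum())
print(f"active fraction: {frames.mean():.3f}, weighted score: {weighted_score:.3f}")
```

For a symmetric kernel, gathering into each output cell and scattering from each active input cell are equivalent, which is why the assertion holds; real sparse-convolution libraries exploit the same observation with hash-indexed active coordinates.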
Source Journal

IEEE Robotics and Automation Letters (Computer Science: Computer Science Applications)
CiteScore: 9.60
Self-citation rate: 15.40%
Articles per year: 1428

Aims and scope: The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.