Spiking neural network tactile classification method with faster and more accurate membrane potential representation

Impact Factor: 2.5 | JCR Quartile: Q2 (Engineering, Industrial)
Jing Yang, Zukun Yu, Xiaoyang Ji, Zhidong Su, Shaobo Li, Yang Cao
{"title":"Spiking neural network tactile classification method with faster and more accurate membrane potential representation","authors":"Jing Yang,&nbsp;Zukun Yu,&nbsp;Xiaoyang Ji,&nbsp;Zhidong Su,&nbsp;Shaobo Li,&nbsp;Yang Cao","doi":"10.1049/cim2.70004","DOIUrl":null,"url":null,"abstract":"<p>Robot perception is an important topic in artificial intelligence field, and tactile recognition in particular is indispensable for human–computer interaction. Efficiently classifying data obtained by touch sensors has long been an issue. In recent years, spiking neural networks (SNNs) have been widely used in tactile data categorisation due to their temporal information processing benefits, low power consumption, and high biological dependability. However, traditional SNN classification methods often encounter under-convergence when using membrane potential representation, decreasing their classification accuracy. Meanwhile, due to the time-discrete nature of SNN models, classification requires a significant time overhead, which restricts their real-time tactile sensing application potential. Considering these concerns, the authors propose a faster and more accurate SNN tactile classification approach using improved membrane potential representation. This method effectively overcomes model convergence problems by optimising the membrane potential expression and the relationship between the loss function and network parameters while significantly reducing the time overhead and enhancing the classification accuracy and robustness of the model. The experimental results show that the propose approach improves the classification accuracy by 4.16% and 2.71% and reduces the overall time by 8.00% and 8.14% on the EvTouch-Containers dataset and EvTouch-Objects dataset, respectively, when compared with existing models.</p>","PeriodicalId":33286,"journal":{"name":"IET Collaborative Intelligent Manufacturing","volume":"6 4","pages":""},"PeriodicalIF":2.5000,"publicationDate":"2024-11-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://onlinelibrary.wiley.com/doi/epdf/10.1049/cim2.70004","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IET Collaborative Intelligent Manufacturing","FirstCategoryId":"1085","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1049/cim2.70004","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"ENGINEERING, INDUSTRIAL","Score":null,"Total":0}
引用次数: 0

Abstract

Robot perception is an important topic in the field of artificial intelligence, and tactile recognition in particular is indispensable for human–computer interaction. Efficiently classifying data obtained from touch sensors has long been a challenge. In recent years, spiking neural networks (SNNs) have been widely used in tactile data categorisation due to their temporal information processing benefits, low power consumption, and high biological plausibility. However, traditional SNN classification methods often encounter under-convergence when using membrane potential representation, which decreases their classification accuracy. Meanwhile, due to the time-discrete nature of SNN models, classification requires a significant time overhead, which restricts their potential for real-time tactile sensing applications. Considering these concerns, the authors propose a faster and more accurate SNN tactile classification approach using an improved membrane potential representation. This method effectively overcomes model convergence problems by optimising the membrane potential expression and the relationship between the loss function and network parameters, while significantly reducing the time overhead and enhancing the classification accuracy and robustness of the model. The experimental results show that the proposed approach improves the classification accuracy by 4.16% and 2.71% and reduces the overall time by 8.00% and 8.14% on the EvTouch-Containers and EvTouch-Objects datasets, respectively, when compared with existing models.
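For readers unfamiliar with membrane-potential readout, the sketch below illustrates the baseline idea such methods build on: a non-spiking output layer whose time-averaged membrane potential serves directly as the class logits for a cross-entropy loss. This is a generic illustration, not the authors' improved representation; the layer sizes, time constant `tau`, step count, input dimension, and rectangular surrogate gradient are all illustrative assumptions.

```python
import torch
import torch.nn as nn


class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike with a rectangular surrogate gradient."""

    @staticmethod
    def forward(ctx, v):
        ctx.save_for_backward(v)
        return (v >= 0).float()

    @staticmethod
    def backward(ctx, grad_out):
        (v,) = ctx.saved_tensors
        # Pass gradients only near the firing threshold.
        return grad_out * (v.abs() < 0.5).float()


class LIFMembraneClassifier(nn.Module):
    """Two-layer SNN whose output layer never spikes; its membrane
    potential, averaged over time, serves directly as class logits.
    All hyperparameters here are illustrative assumptions."""

    def __init__(self, in_dim=78, hidden_dim=128, num_classes=20,
                 num_steps=50, tau=2.0, threshold=1.0):
        super().__init__()
        self.fc1 = nn.Linear(in_dim, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, num_classes)
        self.num_steps = num_steps
        self.tau = tau            # membrane time constant (controls the leak)
        self.threshold = threshold

    def forward(self, x):
        # x: (batch, num_steps, in_dim) spike trains from the tactile sensor
        batch = x.size(0)
        v_hid = torch.zeros(batch, self.fc1.out_features, device=x.device)
        v_out = torch.zeros(batch, self.fc2.out_features, device=x.device)
        v_sum = torch.zeros_like(v_out)
        for t in range(self.num_steps):
            # Hidden LIF layer: leaky integration, fire, hard reset.
            v_hid = v_hid + (self.fc1(x[:, t]) - v_hid) / self.tau
            spikes = SurrogateSpike.apply(v_hid - self.threshold)
            v_hid = v_hid * (1.0 - spikes)
            # Output layer integrates but never fires or resets.
            v_out = v_out + (self.fc2(spikes) - v_out) / self.tau
            v_sum = v_sum + v_out
        # Time-averaged membrane potential as logits.
        return v_sum / self.num_steps


# Toy usage: random Bernoulli spike trains, cross-entropy on the readout.
model = LIFMembraneClassifier()
x = torch.rand(4, 50, 78).bernoulli()
logits = model(x)
loss = nn.functional.cross_entropy(logits, torch.tensor([0, 1, 2, 3]))
loss.backward()
```

The under-convergence the paper targets typically stems from how the loss couples to the accumulated potential over many time steps; the authors' contribution is to reshape that expression and its relationship to the network parameters, which this baseline sketch does not reproduce.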

Source journal

IET Collaborative Intelligent Manufacturing (Engineering: Industrial and Manufacturing Engineering)
CiteScore: 9.10
Self-citation rate: 2.40%
Articles published: 25
Review time: 20 weeks
Journal description: IET Collaborative Intelligent Manufacturing is a Gold Open Access journal that focuses on the development of efficient and adaptive production and distribution systems. It aims to meet ever-changing market demands by publishing original research on methodologies and techniques for the application of intelligence, data science, and emerging information and communication technologies in various aspects of manufacturing, such as design, modeling, simulation, planning, and optimization of products, processes, production, and assembly. The journal is indexed in COMPENDEX (Elsevier), Directory of Open Access Journals (DOAJ), Emerging Sources Citation Index (Clarivate Analytics), INSPEC (IET), SCOPUS (Elsevier) and Web of Science (Clarivate Analytics).