Matteo Antonio Scrugli;Gianluca Leone;Paola Busia;Luigi Raffo;Paolo Meloni
IEEE Transactions on Biomedical Circuits and Systems, vol. 19, no. 1, pp. 68-81. Published 2024-09-09. DOI: 10.1109/TBCAS.2024.3456552. Open-access PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10669772
Real-Time sEMG Processing With Spiking Neural Networks on a Low-Power 5K-LUT FPGA
The accurate modeling of hand movement based on the analysis of surface electromyographic (sEMG) signals offers exciting opportunities for the development of complex prosthetic devices and human-machine interfaces, moving from discrete gesture recognition towards continuous movement tracking. In this study, we present two solutions for real-time sEMG processing, based on lightweight Spiking Neural Networks (SNNs) and efficiently implemented on a Lattice iCE40-UltraPlus FPGA, making them especially suitable for low-power applications. We first assess performance on the discrete finger gesture recognition task, taking the NinaPro DB5 dataset as a reference, and demonstrate an accuracy of 83.17% in the classification of twelve different finger gestures. We then consider the more challenging problem of continuous finger force modeling, referencing the Hyser dataset for finger tracking during independent extension and contraction exercises. The assessment reveals a correlation of up to 0.875 with the ground-truth forces. Our systems take advantage of SNNs' inherent efficiency: dissipating 11.31 mW in active mode, they consume 44.6 µJ per gesture recognition classification and 1.19 µJ per force modeling inference. With dynamic power-consumption management and the introduction of idle periods, average power drops to 1.84 mW and 3.69 mW for these respective tasks.
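As a rough illustration of how the reported figures relate, dividing the energy per inference by the active-mode power gives an estimated active processing time per inference. This back-of-the-envelope sketch is derived from the abstract's numbers only and is not a figure reported by the authors:

```python
# Estimate active time per inference from the abstract's reported figures:
# energy per inference (J) / active-mode power (W) = active time (s).
# Illustrative only; the paper may define these quantities differently.

ACTIVE_POWER_W = 11.31e-3  # reported active-mode power dissipation, 11.31 mW


def active_time_ms(energy_j: float, power_w: float = ACTIVE_POWER_W) -> float:
    """Estimated active processing time per inference, in milliseconds."""
    return energy_j / power_w * 1e3


gesture_ms = active_time_ms(44.6e-6)  # gesture recognition: 44.6 µJ
force_ms = active_time_ms(1.19e-6)    # force modeling: 1.19 µJ

print(f"gesture recognition: ~{gesture_ms:.2f} ms active per classification")
print(f"force modeling:      ~{force_ms:.3f} ms active per inference")
```

Under these assumptions, a gesture classification occupies the FPGA for roughly 3.9 ms and a force-modeling inference for roughly 0.1 ms, which is consistent with the large average-power reduction the authors report once idle periods are introduced.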