M. Eggimann, Jonas Erb, Philipp Mayer, M. Magno, L. Benini
{"title":"基于新型近距离雷达传感器的低功耗嵌入式手势识别","authors":"M. Eggimann, Jonas Erb, Philipp Mayer, M. Magno, L. Benini","doi":"10.1109/SENSORS43011.2019.8956617","DOIUrl":null,"url":null,"abstract":"This work proposes a low-power high-accuracy embedded hand-gesture recognition using low power short-range radar sensors. The hardware and software match the requirements for battery-operated wearable devices. A 2D Convolutional Neural Network (CNN) using range frequency Doppler features is combined with a Temporal Convolutional Neural Network (TCN) for time sequence prediction. The final algorithm has a model size of only 45723 parameters, yielding a memory footprint of only 91kB. Two datasets containing 11 challenging hand gestures performed by 26 different people have been recorded containing a total of 20210 gesture instances. On the 11 hands, gestures and an accuracy of 87% (26 users) and 92% (single user) have been achieved. Furthermore, the prediction algorithm has been implemented in the GAP8 Parallel Ultra-Low-Power processor by GreenWaves Technologies, showing that live-prediction is feasible with only 21mW of power consumption for the full gesture prediction neural network.","PeriodicalId":6710,"journal":{"name":"2019 IEEE SENSORS","volume":"27 1","pages":"1-4"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"8","resultStr":"{\"title\":\"Low Power Embedded Gesture Recognition Using Novel Short-Range Radar Sensors\",\"authors\":\"M. Eggimann, Jonas Erb, Philipp Mayer, M. Magno, L. Benini\",\"doi\":\"10.1109/SENSORS43011.2019.8956617\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This work proposes a low-power high-accuracy embedded hand-gesture recognition using low power short-range radar sensors. The hardware and software match the requirements for battery-operated wearable devices. A 2D Convolutional Neural Network (CNN) using range frequency Doppler features is combined with a Temporal Convolutional Neural Network (TCN) for time sequence prediction. The final algorithm has a model size of only 45723 parameters, yielding a memory footprint of only 91kB. Two datasets containing 11 challenging hand gestures performed by 26 different people have been recorded containing a total of 20210 gesture instances. On the 11 hands, gestures and an accuracy of 87% (26 users) and 92% (single user) have been achieved. 
Furthermore, the prediction algorithm has been implemented in the GAP8 Parallel Ultra-Low-Power processor by GreenWaves Technologies, showing that live-prediction is feasible with only 21mW of power consumption for the full gesture prediction neural network.\",\"PeriodicalId\":6710,\"journal\":{\"name\":\"2019 IEEE SENSORS\",\"volume\":\"27 1\",\"pages\":\"1-4\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"8\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE SENSORS\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SENSORS43011.2019.8956617\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE SENSORS","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SENSORS43011.2019.8956617","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Low Power Embedded Gesture Recognition Using Novel Short-Range Radar Sensors
This work proposes a low-power, high-accuracy embedded hand-gesture recognition system using low-power short-range radar sensors. The hardware and software match the requirements of battery-operated wearable devices. A 2D Convolutional Neural Network (CNN) operating on range-frequency Doppler features is combined with a Temporal Convolutional Network (TCN) for time-sequence prediction. The final algorithm has a model size of only 45,723 parameters, yielding a memory footprint of only 91 kB. Two datasets containing 11 challenging hand gestures performed by 26 different people have been recorded, containing a total of 20,210 gesture instances. On the 11 hand gestures, accuracies of 87% (26 users) and 92% (single user) have been achieved. Furthermore, the prediction algorithm has been implemented on the GAP8 Parallel Ultra-Low-Power processor by GreenWaves Technologies, showing that live prediction is feasible with only 21 mW of power consumption for the full gesture-prediction neural network.
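The abstract does not give implementation details, but the described pipeline, a per-frame 2D CNN on range-Doppler features feeding a temporal convolution stack over the frame sequence, can be sketched as below. This is a minimal illustration and not the authors' code: the input resolution (32x32 range-Doppler maps), sequence length, channel widths, and feature dimension are assumptions chosen only to make the sketch runnable, and the sketch does not reproduce the reported 45,723-parameter model.

import torch
import torch.nn as nn

class FrameCNN(nn.Module):
    """2D CNN that maps one range-Doppler frame to a feature vector."""
    def __init__(self, feat_dim: int = 32):
        super().__init__()
        self.conv = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(8, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.fc = nn.Linear(16 * 8 * 8, feat_dim)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, 1, 32, 32) -> (batch, feat_dim)
        return self.fc(self.conv(x).flatten(1))

class GestureNet(nn.Module):
    """Per-frame CNN followed by a dilated 1D convolution stack (TCN-style)
    over the sequence of frame features, ending in an 11-way classifier."""
    def __init__(self, num_classes: int = 11, feat_dim: int = 32):
        super().__init__()
        self.frame_cnn = FrameCNN(feat_dim)
        # NOTE: a true TCN uses causal (left-only) padding; 'same' padding
        # is used here purely to keep the sketch short.
        self.tcn = nn.Sequential(
            nn.Conv1d(feat_dim, 32, kernel_size=3, padding="same", dilation=1),
            nn.ReLU(),
            nn.Conv1d(32, 32, kernel_size=3, padding="same", dilation=2),
            nn.ReLU(),
        )
        self.classifier = nn.Linear(32, num_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, T, 1, 32, 32), one range-Doppler map per time step
        b, t = x.shape[:2]
        feats = self.frame_cnn(x.flatten(0, 1))        # (b*t, feat_dim)
        feats = feats.view(b, t, -1).transpose(1, 2)   # (b, feat_dim, t)
        h = self.tcn(feats)                            # (b, 32, t)
        return self.classifier(h[:, :, -1])            # logits at last step

# Smoke test with a random batch of 2 sequences of 16 frames.
logits = GestureNet()(torch.randn(2, 16, 1, 32, 32))
print(logits.shape)  # torch.Size([2, 11])

A real deployment on the GAP8 would go through the vendor toolchain rather than PyTorch; the sketch only illustrates the CNN-plus-TCN network structure described in the abstract.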