Low-cost static gesture recognition system using MEMS accelerometers
Ajay Kannan, Ateendra Ramesh, L. Srinivasan, Vineeth Vijayaraghavan
2017 Global Internet of Things Summit (GIoTS), 6 June 2017. DOI: 10.1109/GIOTS.2017.8016217
Abstract: The primary objective of this paper is to construct and test a low-cost, minimally supervised gesture recognition system that identifies static gestures efficiently and accurately. The proposed system uses ADXL335 accelerometers to track the gestures; these sensors are interfaced with an Arduino ATmega2560 micro-controller for data processing and gesture recognition. The system software, implemented on the micro-controller, features a computationally lightweight algorithm that requires only nominal resources to recognize the gestures. The paper further discusses minimizing the number of accelerometers to reduce the cost and power consumption of the system. The performance of the system is assessed on static gestures from the American Sign Language (ASL) alphabet, using data sets obtained from 3 trained ASL signers. The average run-time efficiency of the proposed system with the maximum and minimum configurations of 5 and 2 accelerometers was found to be 95.3% and 87.0% respectively, with the corresponding prototype costs being 20 USD and 12.5 USD. It was also found that a new user can train the system on the static ASL alphabet gestures in under two minutes with any system configuration. The authors also note that the system is compatible with other IoT platforms for interoperability.
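The abstract does not disclose the recognition algorithm itself, only that it is light enough to run on the micro-controller with nominal resources. The sketch below is therefore an illustrative assumption rather than the authors' method: it reads a single ADXL335 over analog pins A0/A1/A2 of an Arduino, stores one averaged template per gesture during a brief training pass, and classifies the current static pose by nearest-neighbour matching in raw ADC space. The pin mapping, the number of gestures, and the template-matching classifier are all hypothetical.

```cpp
// Illustrative sketch only: the paper does not publish its code. Assumptions
// (not from the paper): one ADXL335 wired to analog pins A0/A1/A2, per-gesture
// templates captured during a short training phase, and classification by
// nearest neighbour over the raw ADC readings.
#include <Arduino.h>
#include <limits.h>

const uint8_t PIN_X = A0, PIN_Y = A1, PIN_Z = A2;  // assumed wiring
const uint8_t NUM_GESTURES = 5;                    // e.g. a small subset of ASL letters

// One stored template (mean ADC reading per axis) per gesture.
int16_t templates[NUM_GESTURES][3];

void readAxes(int16_t out[3]) {
  out[0] = analogRead(PIN_X);
  out[1] = analogRead(PIN_Y);
  out[2] = analogRead(PIN_Z);
}

// Capture a template for gesture `id` by averaging a burst of samples
// while the user holds the static pose.
void trainGesture(uint8_t id, uint8_t samples = 32) {
  long sum[3] = {0, 0, 0};
  int16_t axes[3];
  for (uint8_t i = 0; i < samples; i++) {
    readAxes(axes);
    for (uint8_t a = 0; a < 3; a++) sum[a] += axes[a];
    delay(5);
  }
  for (uint8_t a = 0; a < 3; a++) templates[id][a] = sum[a] / samples;
}

// Classify the current pose as the template with the smallest squared
// Euclidean distance (integer-only, cheap enough for an 8-bit MCU).
uint8_t classify() {
  int16_t axes[3];
  readAxes(axes);
  long bestDist = LONG_MAX;
  uint8_t best = 0;
  for (uint8_t g = 0; g < NUM_GESTURES; g++) {
    long d = 0;
    for (uint8_t a = 0; a < 3; a++) {
      long diff = axes[a] - templates[g][a];
      d += diff * diff;
    }
    if (d < bestDist) { bestDist = d; best = g; }
  }
  return best;
}

void setup() {
  Serial.begin(9600);
  // In a real session the user would be prompted to hold each pose;
  // here every gesture is "trained" back to back for brevity.
  for (uint8_t g = 0; g < NUM_GESTURES; g++) trainGesture(g);
}

void loop() {
  Serial.println(classify());  // print the index of the recognised static gesture
  delay(200);
}
```

Using a squared-distance, integer-only comparison avoids floating-point math and square roots on the 8-bit MCU, which is one plausible way to meet the "nominal resources" and short training-time claims made in the abstract.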