{"title":"iPand: Accurate Gesture Input with Smart Acoustic Sensing on Hand","authors":"Shumin Cao, Panlong Yang, Xiangyang Li, Mingshi Chen, Peide Zhu","doi":"10.1109/SAHCN.2018.8397157","DOIUrl":null,"url":null,"abstract":"Finger gesture input is emerged as an increasingly popular means of human-computer interactions. In this demo, we propose iPand, an acoustic sensing system that enables finger gesture input on the skin, which is more convenient, user-friendly and always accessible. Unlike past works, which implement gesture input with dedicated devices, our system exploits passive acoustic sensing to identify the gestures, e.g. swipe left, swipe right, pinch and spread. The intuition underlying our system is that specific gesture emits unique friction sound, which can be captured by the microphone embedded in wearable devices. We then adopt convolutional neural network to recognize the gestures. We implement and evaluate iPand using COTS smartphones and smartwatches. Results from three daily scenarios (i.e., library, lab and cafe) of 10 volunteers show that iPand can achieve the recognition accuracy of 87%, 81% and 77% respectively.","PeriodicalId":139623,"journal":{"name":"2018 15th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 15th Annual IEEE International Conference on Sensing, Communication, and Networking (SECON)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SAHCN.2018.8397157","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 5
Abstract
Finger gesture input has emerged as an increasingly popular means of human-computer interaction. In this demo, we propose iPand, an acoustic sensing system that enables finger gesture input on the skin, which is more convenient, user-friendly, and always accessible. Unlike past works, which implement gesture input with dedicated devices, our system exploits passive acoustic sensing to identify gestures such as swipe left, swipe right, pinch, and spread. The intuition underlying our system is that each gesture emits a unique friction sound, which can be captured by the microphone embedded in wearable devices. We then adopt a convolutional neural network to recognize the gestures. We implement and evaluate iPand using COTS smartphones and smartwatches. Results from 10 volunteers across three daily scenarios (library, lab, and cafe) show that iPand achieves recognition accuracies of 87%, 81%, and 77%, respectively.
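To make the recognition stage concrete, the following is a minimal sketch of a CNN classifier over friction-sound clips, assuming the audio is first converted to fixed-size log-mel spectrogram patches. The architecture, input size, and feature choice are illustrative assumptions; the abstract does not specify the network the authors used.

```python
# Hypothetical sketch: classify short friction-sound clips (as 1 x 64 x 64
# log-mel spectrogram patches) into the four gestures named in the abstract.
# The architecture and input shape are assumptions, not the paper's design.
import torch
import torch.nn as nn

GESTURES = ["swipe_left", "swipe_right", "pinch", "spread"]

class GestureCNN(nn.Module):
    def __init__(self, n_classes: int = len(GESTURES)):
        super().__init__()
        # Two small conv blocks, each halving the spatial resolution.
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # -> 16 x 32 x 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # -> 32 x 16 x 16
        )
        self.classifier = nn.Linear(32 * 16 * 16, n_classes)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.features(x)
        return self.classifier(x.flatten(1))      # per-gesture logits

if __name__ == "__main__":
    model = GestureCNN()
    clip = torch.randn(1, 1, 64, 64)  # stand-in for one spectrogram patch
    logits = model(clip)
    print(GESTURES[logits.argmax(dim=1).item()])
```

In a deployment like the one described, the wearable's microphone stream would be segmented into candidate gesture windows, featurized, and fed through such a network; the printed label is the predicted gesture.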