WiFinger: talk to your smart devices with finger-grained gesture
Hong Li, Wei Yang, Jianxin Wang, Yang Xu, Liusheng Huang
Proceedings of the 2016 ACM International Joint Conference on Pervasive and Ubiquitous Computing
Published: 2016-09-12 · DOI: 10.1145/2971648.2971738
Citations: 235
Abstract
In the recent literature, WiFi signals have been widely used to "sense" people's locations and activities. Researchers have exploited the characteristics of wireless signals to "hear" people talk and "see" their keystrokes. Inspired by this line of work, we explore human-computer interaction using finger-grained gestures in a WiFi environment. In this paper, we present WiFinger, the first solution that uses ubiquitous wireless signals to achieve number text input on WiFi devices. We implement a prototype of WiFinger on commercial WiFi infrastructure. Our scheme is based on the key intuition that, while performing a given gesture, a user's fingers move in a unique formation and direction, and thus generate a unique pattern in the time series of Channel State Information (CSI) values. WiFinger is designed to recognize a set of finger-grained gestures, which are further used to realize continuous text input on off-the-shelf WiFi devices. The results show that WiFinger achieves up to 90.4% average classification accuracy in recognizing nine finger-grained digit gestures from American Sign Language (ASL), and its average accuracy for single-user number text input on a desktop reaches 82.67% over 90 digits.
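The key intuition above — each gesture leaves a distinct pattern in the CSI amplitude time series, which can then be matched against known gesture templates — can be illustrated with a minimal sketch. The abstract does not specify the paper's actual classifier, so the code below assumes a simple template-matching scheme: dynamic time warping (DTW) distance plus 1-nearest-neighbor over synthetic waveforms. The template shapes and labels are purely hypothetical stand-ins for real CSI measurements.

```python
import math
import random

def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D series,
    allowing the compared gestures to differ in speed/length."""
    n, m = len(a), len(b)
    inf = float("inf")
    D = [[inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j],      # insertion
                                 D[i][j - 1],      # deletion
                                 D[i - 1][j - 1])  # match
    return D[n][m]

def classify(sample, templates):
    """1-NN: return the label whose template is closest under DTW."""
    return min(templates, key=lambda label: dtw_distance(sample, templates[label]))

# Toy "CSI amplitude" templates (hypothetical shapes, not real measurements):
# each digit gesture is assumed to leave a distinct waveform in the CSI stream.
ts = [i / 49 for i in range(50)]
templates = {
    "one": [math.sin(2 * math.pi * x) for x in ts],
    "two": [math.sin(4 * math.pi * x) for x in ts],
}

# A noisy observation of the "one" gesture should still match its template.
rng = random.Random(0)
sample = [math.sin(2 * math.pi * x) + 0.05 * rng.gauss(0, 1) for x in ts]
print(classify(sample, templates))
```

In a real pipeline, the series would come from denoised CSI amplitude streams segmented around each gesture; DTW-style elastic matching is one common choice for such signals because users perform the same gesture at varying speeds.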