Yang Qifan, Tang Hao, Zhao Xuebing, Li Yin, Zhang Sanfeng
Title: Dolphin: Ultrasonic-Based Gesture Recognition on Smartphone Platform
DOI: 10.1109/CSE.2014.273 (https://doi.org/10.1109/CSE.2014.273)
Published in: 2014 IEEE 17th International Conference on Computational Science and Engineering
Publication date: 2014-12-19
Citations: 53
Abstract
In-air gesture recognition can improve the user experience of smart mobile devices in numerous scenarios. Most existing methods proposed by industry and academia rely on special sensors. In contrast, this paper proposes Dolphin, an in-air gesture recognition method that requires no special sensors and can be applied directly to off-the-shelf smart devices. The only sensors Dolphin needs are the loudspeaker and microphone embedded in the device. Dolphin emits a continuous 21 kHz tone through the loudspeaker and receives the gesture-reflected ultrasonic wave through the microphone. The performed gesture is encoded into the reflected ultrasonic wave in the form of a Doppler shift. By combining manual recognition and machine learning methods, Dolphin extracts features from the Doppler shift and recognizes a rich set of pre-defined gestures with high accuracy in real time. The parameter selection strategy and gesture recognition under several scenarios are discussed and evaluated in detail. Dolphin can be adapted to multiple devices and users through training with machine learning methods.
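To give a rough sense of the sensing principle described above (this is a minimal sketch, not the paper's implementation), the snippet below computes the Doppler shift expected from a hand reflecting a 21 kHz tone, Δf ≈ 2·v·f0/c, and recovers that shift from a simulated echo with an FFT. The sampling rate and hand speed are assumed values chosen for illustration.

```python
import numpy as np

C = 343.0      # speed of sound in air, m/s
F0 = 21_000.0  # carrier frequency, Hz (as in the paper)
FS = 96_000    # assumed sampling rate, Hz (must satisfy Nyquist for 21 kHz)

def doppler_shift(hand_speed):
    """Expected frequency shift for a reflection off a hand moving at
    hand_speed m/s toward the device: Δf ≈ 2*v*f0/c (two-way path)."""
    return 2.0 * hand_speed * F0 / C

def measure_shift(hand_speed, duration=0.5):
    """Simulate the received echo as a pure shifted tone and recover
    the shift as the offset of the FFT peak from the carrier."""
    t = np.arange(int(FS * duration)) / FS
    echo = np.cos(2 * np.pi * (F0 + doppler_shift(hand_speed)) * t)
    spectrum = np.abs(np.fft.rfft(echo))
    freqs = np.fft.rfftfreq(len(echo), 1.0 / FS)
    return freqs[np.argmax(spectrum)] - F0
```

A hand moving toward the microphone at 0.5 m/s shifts the 21 kHz carrier up by about 61 Hz; a receding hand produces a negative shift. With a 0.5 s window the FFT bin width is 2 Hz, so the recovered shift matches the prediction to within a bin or two.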