Controlling Non-Touch Screens as Touch Screens Using Airpen, a Writing Tool with In-Air Gesturing Mode

Qingcheng Li, Heng Cao, Y. Lu, Haoqiang Yan, Tao Li

2016 International Symposium on System and Software Reliability (ISSSR), October 2016. DOI: 10.1109/ISSSR.2016.020. Citations: 7.

Abstract
The keyboard and mouse, as fundamental input devices of computer systems, have changed traditional human input behavior and writing experiences. However, handwriting and painting remain indispensable styles of communication. In recent years, the touch screen has become a prevailing human-computer interface on many kinds of digital devices, making it possible for people to use a pen to take notes as if writing on paper. Moreover, to preserve these primitive input habits, body language also needs to be translated by machines. With the development of embedded sensors in smart devices and of artificial intelligence technologies, many new approaches have been applied to recognizing human actions. In this paper, we propose the Airpen system, which bridges two input modes, writing and in-air gesture control of human body language, through a pen-shaped controller. The Airpen system projects touch-screen writing trajectories onto a projection screen and uses in-air gestures to perform control operations on non-touch screens. We first design an input-mode switching metric based on acceleration status, then implement in-air gesture recognition using a support vector machine (SVM) on accelerometer data. Finally, we validate that our input-mode switching metric is hardly perceptible to users and that in-air gesture recognition achieves high accuracy.
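The abstract names SVM classification over accelerometer data as the gesture-recognition component but gives no implementation details. The following is a minimal illustrative sketch, not the authors' pipeline: it assumes fixed-length 3-axis accelerometer windows, simple per-axis mean/standard-deviation features, synthetic data for two hypothetical gesture classes, and scikit-learn's `SVC` as the classifier.

```python
# Sketch of SVM-based in-air gesture recognition from accelerometer
# windows. The feature choice (per-axis mean/std), the two gesture
# classes, and the synthetic data are illustrative assumptions; the
# paper does not specify its features or training data.
import numpy as np
from sklearn.svm import SVC

def extract_features(window):
    """window: (n_samples, 3) array of x/y/z accelerometer readings.
    Returns a 6-dim feature vector: per-axis mean and std."""
    w = np.asarray(window, dtype=float)
    return np.concatenate([w.mean(axis=0), w.std(axis=0)])

rng = np.random.default_rng(0)

def synth_windows(label, n, length=50):
    """Generate n synthetic windows: a 'swipe' biased along x with low
    variance, vs. a 'circle' with larger variation on all axes.
    Gravity (~9.8 m/s^2) dominates the z axis in both."""
    if label == "swipe":
        return [rng.normal([2.0, 0.0, 9.8], 0.3, size=(length, 3)) for _ in range(n)]
    return [rng.normal([0.0, 0.0, 9.8], 1.5, size=(length, 3)) for _ in range(n)]

X, y = [], []
for label in ("swipe", "circle"):
    for w in synth_windows(label, 30):
        X.append(extract_features(w))
        y.append(label)

# RBF-kernel SVM, as is typical for low-dimensional motion features.
clf = SVC(kernel="rbf", C=1.0, gamma="scale")
clf.fit(np.array(X), y)

# Classify a fresh window drawn from the 'swipe' distribution.
test_window = rng.normal([2.0, 0.0, 9.8], 0.3, size=(50, 3))
print(clf.predict([extract_features(test_window)])[0])
```

In practice the same structure applies with real sensor streams: segment the accelerometer signal into windows (the paper's acceleration-based mode switch would decide which windows count as gestures rather than pen writing), extract features, and feed them to the trained SVM.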