PinchPad: performance of touch-based gestures while grasping devices
Katrin Wolf, Christian Müller-Tomfelde, Kelvin Cheng, I. Wechsung
Proceedings of the Sixth International Conference on Tangible, Embedded and Embodied Interaction, 19 February 2012
DOI: 10.1145/2148131.2148155
Citations: 28
Abstract
This paper focuses on combining front- and back-of-device interaction on grasped devices using touch-based gestures. We designed generic interactions for discrete, continuous, and combined gesture commands that are executed without hand-eye control, because the performing fingers are hidden behind the grasped device. The interactions are designed so that the thumb can always serve as a proprioceptive reference for guiding finger movements, applying embodied knowledge about body structure. In a user study, we tested these touch-based interactions for their performance and the users' perceived task load. We combined two iPads back-to-back to form a double-sided touch screen device: the PinchPad. We discuss the main errors that led to a decrease in accuracy, identify stable features that reduce the error rate, and discuss the role of the 'body schema' in designing gesture-based interactions where the user cannot properly see their hands.