Contour-guided gaze gestures: eye-based interaction with everyday objects and IoT devices
Florian Jungwirth, Michael Haslgrübler, A. Ferscha
Proceedings of the Seventh International Conference on the Internet of Things, October 2017. DOI: 10.1145/3131542.3140262
Citations: 2
Abstract
The eyes are attracting increasing interest within the HCI (human-computer interaction) community, as they offer a fast and accurate input modality. However, the applicability of mobile eye-based HCI has so far been restricted by several issues, such as calibration or the Midas Touch problem [5]. In this work we propose the idea of contour-guided gaze gestures, which overcome these problems by relying on relative eye movements as users trace the contours of (interactive) objects within a smart environment. Matching the trajectory of the eye movements against the shape of each contour makes it possible to estimate which object was interacted with and to trigger the corresponding actions. We describe the concept of the system and illustrate several application scenarios, demonstrating its value.
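The abstract describes the matching step only at a high level; the minimal Python sketch below illustrates one plausible way such trajectory-to-contour matching could work. All names (`resample`, `contour_distance`, `match_object`), the similarity measure, and the rejection threshold are illustrative assumptions, not the authors' published implementation; it requires only NumPy.

```python
import numpy as np

def resample(points, n=64):
    """Resample a polyline to n points evenly spaced along its arc length."""
    points = np.asarray(points, dtype=float)
    seg = np.linalg.norm(np.diff(points, axis=0), axis=1)
    dist = np.concatenate([[0.0], np.cumsum(seg)])
    t = np.linspace(0.0, dist[-1], n)
    return np.stack([np.interp(t, dist, points[:, 0]),
                     np.interp(t, dist, points[:, 1])], axis=1)

def normalize(points):
    """Center on the centroid and scale to unit RMS radius, so the match
    depends only on the traced shape, not on absolute gaze coordinates
    (relative eye movements; no calibration needed)."""
    points = points - points.mean(axis=0)
    scale = np.sqrt((points ** 2).sum(axis=1).mean())
    return points / max(scale, 1e-9)

def contour_distance(gaze, contour, n=64):
    """Mean point-wise distance between the normalized gaze trace and the
    normalized contour, minimized over cyclic shifts (the user may start
    anywhere on the outline) and over both tracing directions."""
    g = normalize(resample(gaze, n))
    c = normalize(resample(contour, n))
    best = np.inf
    for cand in (c, c[::-1]):                 # clockwise or counter-clockwise
        for shift in range(n):
            d = np.linalg.norm(g - np.roll(cand, shift, axis=0), axis=1).mean()
            best = min(best, d)
    return best

def match_object(gaze, contours, threshold=0.25):
    """Return the id of the best-matching contour, or None if even the best
    match is too dissimilar (rejecting stray glances)."""
    scores = {oid: contour_distance(gaze, c) for oid, c in contours.items()}
    oid, score = min(scores.items(), key=lambda kv: kv[1])
    return oid if score < threshold else None

# Hypothetical scene: a square picture frame and a round thermostat.
theta = np.linspace(0.0, 2.0 * np.pi, 100)
circle = np.stack([np.cos(theta), np.sin(theta)], axis=1)
square = np.array([[0, 0], [1, 0], [1, 1], [0, 1], [0, 0]], dtype=float)
gaze_trace = circle * 3.0 + np.array([5.0, 2.0])   # user traced the round object
print(match_object(gaze_trace, {"frame": square, "thermostat": circle}))  # -> thermostat
```

Because both traces are normalized for position and scale before comparison, the match depends only on relative eye movements, consistent with the abstract's claim that the approach sidesteps calibration; the rejection threshold acts as a safeguard against triggering actions from stray glances, i.e., the Midas Touch problem.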