PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029961
Yanxia Zhang, A. Bulling, Hans-Werner Gellersen
"Discrimination of gaze directions using low-level eye image features"
Abstract: In mobile daily life settings, video-based gaze tracking faces challenges associated with changes in lighting conditions and artefacts in the video images caused by head and body movements. These challenges call for the development of new methods that are robust to such influences. In this paper we investigate the problem of gaze estimation, more specifically how to discriminate different gaze directions from eye images. In a 17-participant user study we record eye images for 13 different gaze directions from a standard webcam. We extract a total of 50 features from these images that encode information on color, intensity and orientations. Using mRMR feature selection and a k-nearest neighbor (kNN) classifier we show that we can estimate these gaze directions with a mean recognition performance of 86%.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029965
Haiwei Dong, Zhiwei Luo
"Human factor affects eye movement pattern during riding motorcycle on the mountain"
Abstract: The human eyes are of great importance for perception, cognition and movement, as about 80% of our information about the surrounding world comes from vision. By analyzing eye movement patterns, we can clarify how humans accomplish everyday tasks with their eyes. Since humans live in communities that are artificial environments, various man-made signs, objects and surrounding people influence them, particularly their eye movement patterns. To fully understand eye movement patterns, we therefore have to consider human factors. This paper focuses on clarifying the eye movement pattern while riding a motorcycle on a mountain road. We use a mobile eye mark tracking system to record the eye motion and the front view. Referring to the recorded video, eye mark analysis and fixation point analysis verify the influence of human factors. In addition, we provide suggestions to promote safe riding.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029960
Jurek Breuninger, C. Lange, K. Bengler
"Implementing gaze control for peripheral devices"
Abstract: The goal of the project "Gaze Controlled Interaction with Peripheral Devices" was to extend the capability of the head-based eye tracking system DIKABLIS to detect gaze allocation to previously defined Areas of Interest (AOI) in real time. This makes it possible to initiate various events or commands when a test person wearing the head unit directs their gaze into an AOI. The commands can be used for interaction with different devices; thus the tool for monitoring and analyzing gaze behavior becomes an interaction medium. With such gaze control, multi-modal interaction concepts could be realized. The project's primary aim was to give people with tetraplegia a means of controlling devices in their home. The experimental set-up was a TV set that can be controlled by gaze.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029958
J. Pelz
"Semantic analysis of mobile eyetracking data"
Abstract: Researchers using laboratory-based eyetracking systems now have access to sophisticated data-analysis tools to reduce raw gaze data, but the huge data sets coming from wearable eyetrackers cannot be analyzed with the same tools. The very lack of constraints that makes mobile systems such powerful tools prevents analysis tools designed for static or tracked observers from working with freely moving observers.
Proposed solutions have included infrared markers hidden in the scene to provide reference points, Simultaneous Localization and Mapping (SLAM), and multi-view geometry techniques that build models from multiple views of a scene. These methods map fixations onto predefined or extracted 3D scene models, allowing traditional static-scene analysis tools to be used.
Another approach to the analysis of mobile eyetracking data is to code fixations with semantically meaningful labels rather than mapping them to fixed 3D locations. This offers two important advantages over the model-based methods: semantic mapping allows coding of dynamic scenes without the need to explicitly track objects, and it provides an inherently flexible and extensible object-based coding scheme.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029964
Shiwei Cheng
"The research framework of eye-tracking based mobile device usability evaluation"
Abstract: Eye tracking is a valuable tool for mobile device usability research, but many challenges remain in creating good usability evaluations, such as obtaining sufficiently accurate eye-movement data given the small viewing angle of a real mobile device. This paper presents a research framework that combines a remote eye tracker and a portable eye tracker for both quantitative and qualitative evaluation. An example is reported in which a mobile device user interface is analyzed in an on-screen simulation using a remote eye tracker and on the real device using a portable eye tracker. The outcome is a list of usability problems and design advice, which illustrates the feasibility and effectiveness of the proposed research framework.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029968
Aiko Hagiwara, A. Sugimoto, K. Kawamoto
"Saliency-based image editing for guiding visual attention"
Abstract: The most important part of an information system that assists human activities is a natural interface with human beings. Gaze information strongly reflects human interest and attention; thus, a gaze-based interface is promising for future use. In particular, if we can smoothly guide the user's visual attention toward a target without interrupting their current visual attention, the usefulness of the gaze-based interface will be greatly enhanced. To realize such an interface, this paper proposes a method that, given a region in an image, edits the image so that the region becomes the most salient. Our method first computes a saliency map of the given image and then iteratively adjusts its intensity and color until the saliency inside the region is the highest in the entire image. Experimental results confirm that our image editing method naturally draws human visual attention toward the specified region.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029966
J. Turner, A. Bulling, Hans-Werner Gellersen
"Combining gaze with manual interaction to extend physical reach"
Abstract: Situated public displays and interactive surfaces are becoming ubiquitous in our daily lives. Issues arise with these devices when attempting to interact over a distance or with content that is physically out of reach. In this paper we outline three techniques that combine gaze with manual hand-controlled input to move objects. We demonstrate and discuss how these techniques could be applied to two scenarios involving (1) a multi-touch surface and (2) a public display and a mobile device.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029962
Mélodie Vidal, A. Bulling, Hans-Werner Gellersen
"Analysing EOG signal features for the discrimination of eye movements with wearable devices"
Abstract: Eye tracking research in human-computer interaction and experimental psychology traditionally focuses on stationary devices and a small number of common eye movements. The advent of pervasive eye tracking promises new applications, such as eye-based mental health monitoring or eye-based activity and context recognition. These applications might require further research on additional eye movement types such as smooth pursuits and the vestibulo-ocular reflex, as these movements have not been studied as extensively as saccades, fixations and blinks. In this paper we report our first step towards an effective discrimination of these movements. In a user study we collect naturalistic eye movements from 19 people using the two most common measurement techniques (EOG and IR-based). We develop a set of basic signal features that we extract from the collected eye movement data and show that a feature-based approach has the potential to discriminate between saccades, smooth pursuits, and vestibulo-ocular reflex movements.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029970
C. Tonkin, A. Duchowski, Joshua Kahue, Paul Schiffgens, Frank Rischner
"Eye tracking over small and large shopping displays"
Abstract: Consumers' visual behavior is compared when shopping for a product on simulated shelving displays of two different sizes: an 11.5 ft. projection canvas and a 15.4 in. laptop screen. Results are compared with search times obtained over virtual (projected) and physical shelves, where recorded search times indicate a tendency toward improved performance with larger displays. The implication for pervasive eye tracking systems is that larger, more realistic environments should be considered.
PETMEI '11 | Pub Date: 2011-09-18 | DOI: 10.1145/2029956.2029967
Xinyong Zhang, Pianpian Xu, Qing Zhang, H. Zha
"Speed-accuracy trade-off in dwell-based eye pointing tasks at different cognitive levels"
Abstract: In this paper, we present a target searching experiment to investigate how long a dwell time is long enough to maintain the speed-accuracy trade-off in eye pointing tasks that use dwell time as the activation mechanism. The experimental task, which takes into account three factors (cognitive complexity, dwell time and visual feedback mode), mixes visual search and target acquisition: the subjects need to search for and recognize the target before the final selection in each trial. The results clarify the ranges of dwell time that allow users to avoid wrong selections as much as possible under different cognitive load conditions. We also discuss the implications for user interface design.