{"title":"Machine-extracted eye gaze features: how well do they correlate to sight-reading abilities of piano players?","authors":"B. Hoanca, T. C. Smith, Kenrick J. Mock","doi":"10.1145/2578153.2578194","DOIUrl":"https://doi.org/10.1145/2578153.2578194","url":null,"abstract":"Skilled piano players are able to decipher and play a musical piece they had never seen before (a skill known as sight-reading). For a sample of 23 piano players of various abilities we consider the correlation between machine-extracted gaze path features and the overall human rating. We find that correlation values (between machine-extracted gaze features and overall human ratings) are statistically similar to correlation values between human-extracted task-related ratings (e.g., note accuracy, error rate) and overall human ratings. These high correlation values suggest that an eye tracking-enabled computer could help students assess their sight-reading abilities, and could possibly advise students on how to improve. The approach could be extended to any musical instrument. For keyboard players, a MIDI keyboard with the appropriate software to provide information about note accuracy and timing could complement feedback from an eye tracker to enable more detailed analysis and advice.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124864077","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"News stories relevance effects on eye-movements","authors":"J. Gwizdka","doi":"10.1145/2578153.2578198","DOIUrl":"https://doi.org/10.1145/2578153.2578198","url":null,"abstract":"Relevance is a fundamental concept in information retrieval. We consider relevance from the user's perspective and ask if the degree of relevance can be inferred from eye-tracking data and if it is related to the cognitive effort involved in relevance judgments. To this end we conducted a study, in which participants were asked to find information in screen-long text documents containing news stories. Each participant responded to fourteen trials consisting of an information question followed by three documents each at a different level of relevance (irrelevant, partially relevant, and relevant). The results indicate that relevant documents tended to be continuously read, while irrelevant documents tended to be scanned. In most cases, cognitive effort inferred from eye-tracking data was highest for partially relevant documents and lowest for irrelevant documents.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125394725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A mixture distribution for visual foraging","authors":"P. Sarma, Tarunraj Singh","doi":"10.1145/2578153.2578210","DOIUrl":"https://doi.org/10.1145/2578153.2578210","url":null,"abstract":"Visual foraging is investigated by examining the nature of statistical distributions underlying human search strategies. Eye movements uninfluenced by scene perception or higher level cognition tasks are used to generate a data set which can be analyzed to study 'pure' searches. Eye movements in the form of 'jump' length constituting the entire search process are studied to detect the presence of statistical distributions whose parameters can be estimated. Animal ecology studies have reported the presence of a Lèvy flight/power law model, which explains animal foraging patterns in few species. We consider a Lèvy flight model to explain visual foraging. Results from data analysis, while not ruling out the presence of a power law entirely, point strongly towards the presence of a mixture distribution which faithfully explains visual foraging. This mixture distribution is made up of gamma distributions.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"187 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116983604","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Eye tracking gaze visualiser: eye tracker and experimental software independent visualisation of gaze data","authors":"B. Fehringer","doi":"10.1145/2578153.2578191","DOIUrl":"https://doi.org/10.1145/2578153.2578191","url":null,"abstract":"Eye tracking research in disciplines such as cognitive psychology requires specific software packages designed for experiments supporting reaction time measurement, blocking and mixing of conditions and item randomisation. Although recording raw eye movement data is possible, its visualisation is difficult regarding the experimental design. The currently used eye tracking software is often built as an all-in-one program that can only visualise the eye tracking data recorded by itself. Therefore, in this paper a software tool is presented that visualises nearly any recorded eye tracking gaze data on the corresponding video independent of the specific software that runs the experiment. Summarised visualisations over randomised item presentations according to experimental conditions can be created. In addition to basic visualisation functionalities, further features such as simple object detection, repetitive pattern exploration and subset selection of subjects are provided.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124106840","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"ISeeCube: visual analysis of gaze data for video","authors":"K. Kurzhals, Florian Heimerl, D. Weiskopf","doi":"10.1145/2578153.2578158","DOIUrl":"https://doi.org/10.1145/2578153.2578158","url":null,"abstract":"We introduce a new design for the visual analysis of eye tracking data recorded from dynamic stimuli such as video. ISeeCube includes multiple coordinated views to support different aspects of various analysis tasks. It combines methods for the spatiotemporal analysis of gaze data recorded from unlabeled videos as well as the possibility to annotate and investigate dynamic Areas of Interest (AOIs). A static overview of the complete data set is provided by a space-time cube visualization that shows gaze points with density-based color mapping and spatiotemporal clustering of the data. A timeline visualization supports the analysis of dynamic AOIs and the viewers' attention on them. AOI-based scanpaths of different viewers can be clustered by their Levenshtein distance, an attention map, or the transitions between AOIs. With the provided visual analytics techniques, the exploration of eye tracking data recorded from several viewers is supported for a wide range of analysis tasks.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"39 5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127986134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Gaze guidance for the visually impaired","authors":"Thomas C. Kübler, Enkelejda Kasneci, W. Rosenstiel","doi":"10.1145/2578153.2583038","DOIUrl":"https://doi.org/10.1145/2578153.2583038","url":null,"abstract":"Visual perception is perhaps the most important sensory input. During driving, about 90% of the relevant information is related to the visual input [Taylor 1982]. However, the quality of visual perception decreases with age, mainly related to a reduce in the visual acuity or in consequence of diseases affecting the visual system. Amongst the most severe types of visual impairments are visual field defects (areas of reduced perception in the visual field), which occur as a consequence of diseases affecting the brain, e.g., stroke, brain injury, trauma, or diseases affecting the optic nerve, e.g., glaucoma. Due to demographic aging, the number of people with such visual impairments is expected to rise [Kasneci 2013]. Since persons suffering from visual impairments may overlook hazardous objects, they are prohibited from driving. This, however, leads to a decrease in quality of life, mobility, and participation in social life. Several studies have shown that some patients show a safe driving behavior despite their visual impairment by performing effective visual exploration, i.e., adequate eye and head movements (e.g., towards their visual field defect [Kasneci et al. 2014b]). Thus, a better understanding of visual perception mechanisms, i.e., of why and how we attend certain parts of our environment while \"ignoring\" others, is a key question to helping visually impaired persons in complex, real-life tasks, such as driving a car.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115372066","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Collaborative eye tracking for image analysis","authors":"Brendan David-John, S. Sridharan, Reynold J. Bailey","doi":"10.1145/2578153.2578215","DOIUrl":"https://doi.org/10.1145/2578153.2578215","url":null,"abstract":"We present a framework for collaborative image analysis where gaze information is shared across all users. A server gathers and broadcasts fixation data from/to all clients and the clients visualize this information. Several visualization options are provided. The system can run in real-time or gaze information can be recorded and shared the next time an image is accessed. Our framework is scalable to large numbers of clients with different eye tracking devices. To evaluate our system we used it within the context of a spot-the-differences game. Subjects were presented with 10 image pairs each containing 5 differences. They were given one minute to detect the differences in each image. Our study was divided into three sessions. In session 1, subjects completed the task individually, in session 2, pairs of subjects completed the task without gaze sharing, and in session 3, pairs of subjects completed the task with gaze sharing. We measured accuracy, time-to-completion and visual coverage over each image to evaluate the performance of subjects in each session. We found that visualizing shared gaze information by graying out previously scrutinized regions of an image significantly increases the dwell time in the areas of the images that are relevant to the task (i.e. the regions where differences actually occurred). Furthermore, accuracy and time-to-completion also improved over collaboration without gaze sharing though the effects were not significant. Our framework is useful for a wide range of image analysis applications which can benefit from a collaborative approach.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114686422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Simulating refraction and reflection of ocular surfaces for algorithm validation in outdoor mobile eye tracking videos","authors":"Thomas B. Kinsman, J. Pelz","doi":"10.1145/2578153.2578203","DOIUrl":"https://doi.org/10.1145/2578153.2578203","url":null,"abstract":"To create input videos for testing pupil detection algorithms for outdoor eye tracking, we develop a simulation of the eye with front-surface reflections of the cornea and the internal refractions of the cornea and refraction at the air/cornea and cornea/aqueous boundaries. The scene and iris are simulated using texture mapping and are alpha-blended to produce the final image of the eye with reflections and refractions. The simulation of refraction is important in order to observe the elliptical shape that the pupil takes on as it goes off axis, and to take into consideration the difference between true pupil position and apparent (entrance) pupil position. Sequences of images are combined to produce input videos for testing the next generation of pupil detection and tracking algorithms, which must sort the pupil out of distracting edges and reflected objects.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130645113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An eye-tracking study assessing the comprehension of c++ and Python source code","authors":"Rachel Turner, Michael Falcone, Bonita Sharif, A. Lazar","doi":"10.1145/2578153.2578218","DOIUrl":"https://doi.org/10.1145/2578153.2578218","url":null,"abstract":"A study to assess the effect of programming language on student comprehension of source code is presented, comparing the languages of C++ and Python in two task categories: overview and find bug tasks. Eye gazes are tracked while thirty-eight students complete tasks and answer questions. Results indicate no significant difference in accuracy or time, however there is a significant difference reported on the rate at which students look at buggy lines of code. These results start to provide some direction as to the effect programming language might have in introductory programming classes.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130686405","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}