{"title":"Eye movements during reading and reading assessment in swedish school children: a new window on reading difficulties","authors":"Andrea Strandberg","doi":"10.1145/3314111.3322878","DOIUrl":"https://doi.org/10.1145/3314111.3322878","url":null,"abstract":"Research during the last decades has demonstrated that eye tracking methodology is an advantageous tool to study reading. A substantial amount of eye movement research has resulted in improved understanding of the reading process in skilled adult readers. Considerably fewer eye tracking studies have examined reading and its development in children. In this doctoral project, eye movements during reading and reading skill are investigated in a population based sample of Swedish elementary school children. The aims are to provide evidence from a large scale study and to explore the concurrent development of reading eye movements and reading skill. In the first study, we describe the eye movement variables across the grades and their connection to assessment on phonemic awareness, decoding strategies and processing speed. During the remainder of the current project, we will focus on longitudinal aspects, as the participants were recorded twice with a one-year-interval. Further, we will examine possible predictors of later reading skill among the eye movement variables and reading assessment outcomes.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127318921","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Microsaccadic and pupillary response to tactile task difficulty","authors":"Justyna Zurawska","doi":"10.1145/3314111.3322875","DOIUrl":"https://doi.org/10.1145/3314111.3322875","url":null,"abstract":"The research goal is to explore the relationship between eye tracking measures and a tactile version of the n-back task. The n-back task is often used to evoke cognitive load, however this is the first study that incorporates tactile stimulus as input. The study follows a within-subject design with easy and difficult experimental conditions. In the tactile n-back task, each participant will be asked to identify the number of pins felt under the fingertips. In the easy condition, each participant will then be asked to respond if a number shown on the computer screen is congruent with the number of recognized pins. In the difficult condition, each participant will be asked to refer to the pin number in the current trial and the previous trial. Microsaccades and pupil dilation will be recorded during the top-down process of performing the n-back task.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127361547","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A deep learning approach for robust head pose independent eye movements recognition from videos","authors":"Rémy Siegfried, Yu Yu, J. Odobez","doi":"10.1145/3314111.3319844","DOIUrl":"https://doi.org/10.1145/3314111.3319844","url":null,"abstract":"Recognizing eye movements is important for gaze behavior understanding like in human communication analysis (human-human or robot interactions) or for diagnosis (medical, reading impairments). In this paper, we address this task using remote RGB-D sensors to analyze people behaving in natural conditions. This is very challenging given that such sensors have a normal sampling rate of 30 Hz and provide low-resolution eye images (typically 36×60 pixels), and natural scenarios introduce many variabilities in illumination, shadows, head pose, and dynamics. Hence gaze signals one can extract in these conditions have lower precision compared to dedicated IR eye trackers, rendering previous methods less appropriate for the task. To tackle these challenges, we propose a deep learning method that directly processes the eye image video streams to classify them into fixation, saccade, and blink classes, and allows to distinguish irrelevant noise (illumination, low-resolution artifact, inaccurate eye alignment, difficult eye shapes) from true eye motion signals. Experiments on natural 4-party interactions demonstrate the benefit of our approach compared to previous methods, including deep learning models applied to gaze outputs.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127547283","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An interactive web-based visual analytics tool for detecting strategic eye movement patterns","authors":"Michael Burch, Ayush Kumar, Neil Timmermans","doi":"10.1145/3317960.3321615","DOIUrl":"https://doi.org/10.1145/3317960.3321615","url":null,"abstract":"In this paper we describe an interactive and web-based visual analytics tool combining linked visualization techniques and algorithmic approaches for exploring the hierarchical visual scanning behavior of a group of people when solving tasks in a static stimulus. This has the benefit that the recorded eye movement data can be observed in a more structured way with the goal to find patterns in the common scanning behavior of a group of eye tracked people. To reach this goal we first preprocess and aggregate the scanpaths based on formerly defined areas of interest (AOIs) which generates a weighted directed graph. We visually represent the resulting AOI graph as a modified hierarchical graph layout. This can be used to filter and navigate in the eye movement data shown in a separate view overplotted on the stimulus for preserving the mental map and for providing an intuitive view on the semantics of the original stimulus. Several interaction techniques and complementary views with visualizations are implemented. Moreover, due to the web-based nature of the tool, users can upload, share, and explore data with others. To illustrate the usefulness of our concept we apply it to real-world eye movement data from a formerly conducted eye tracking experiment.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123464905","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Attention towards privacy notifications on web pages","authors":"Agnieszka Ozimek, Paulina Lewandowska, Krzysztof Krejtz, A. Duchowski","doi":"10.1145/3317960.3321618","DOIUrl":"https://doi.org/10.1145/3317960.3321618","url":null,"abstract":"A significant number of Internet users is unaware of privacy threats and they might not pay much attention to privacy notifications. It is imperative to create effective but, at the same time, non-distracting notifications about privacy policy of different web services. The article presents an eye-tracking study (N = 16) testing the effectiveness of different common types of privacy notification in capturing and retaining users' attention. Results showed that an non-intrusive notification presented at the top of the web page may be as effective as intrusive notifications in capturing attention. Results are discussed in terms of well-known effects of visual attention distribution on web pages. Present findings also offer practical conclusions for web service developers and publishers.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"97 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116441628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"High-resolution eye tracking using scanning laser ophthalmoscopy","authors":"Norick R. Bowers, A. Gibaldi, Emma Alexander, M. Banks, A. Roorda","doi":"10.1145/3314111.3322877","DOIUrl":"https://doi.org/10.1145/3314111.3322877","url":null,"abstract":"Current eye-tracking techniques rely primarily on video-based tracking of components of the anterior surfaces of the eye. However, these trackers have several limitations. Their limited resolution precludes study of small fixational eye motion. Furthermore, many of these trackers rely on calibration procedures that do not offer a way to validate their eye motion traces. By comparison, retinal-image-based trackers can track the motion of the retinal image directly, at frequencies greater than 1kHz and with subarcminute accuracy. The retinal image provides a way to validate the eye position at any point in time, offering an unambiguous record of eye motion as a reference for the eye trace. The benefits of using scanning retinal imaging systems as eye trackers, however, comes at the price of different problems that are not present in video-based systems, and need to be solved to obtain robust eye traces. The current abstract provides an overview of retinal-image-based eye tracking methods, provides preliminary eye-tracking results from a tracking scanning-laser ophthalmoscope (TSLO), and proposes a new binocular line-scanning eye-tracking system.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129241007","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"EyeMRTK","authors":"D. Mardanbegi, T. Pfeiffer","doi":"10.1145/3314111.3318155","DOIUrl":"https://doi.org/10.1145/3314111.3318155","url":null,"abstract":"","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"3 5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123681270","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Accessible control of telepresence robots based on eye tracking","authors":"Guangtao Zhang, J. P. Hansen","doi":"10.1145/3314111.3322869","DOIUrl":"https://doi.org/10.1145/3314111.3322869","url":null,"abstract":"Gaze may be a good alternative input modality for people with limited hand mobility. This accessible control based on eye tracking can be implemented into telepresence robots, which are widely used to promote remote social interaction and providing the feeling of presence. This extended abstract introduces a Ph.D. research project, which takes a two-phase approach towards investigating gaze-controlled telepresence robots. A system supporting gaze-controlled telepresence has been implemented. However, our current findings indicate that there were still serious challenges with regard to gaze-based driving. Potential improvements are discussed, and plans for future study are also presented.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121385642","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Inducing gaze gestures by static illustrations","authors":"P. Majaranta, Jari Laitinen, J. Kangas, Poika Isokoski","doi":"10.1145/3317956.3318151","DOIUrl":"https://doi.org/10.1145/3317956.3318151","url":null,"abstract":"In gesture-based user interfaces, the effort needed for learning the gestures is a persistent problem that hinders their adoption in products. However, people's natural gaze paths form shapes during viewing. For example, reading creates a recognizable pattern. These gaze patterns can be utilized in human-technology interaction. We experimented with the idea of inducing specific gaze patterns by static drawings. The drawings included visual hints to guide the gaze. By looking at the parts of the drawing, the user's gaze composed a gaze gesture that activated a command. We organized a proof-of-concept trial to see how intuitive the idea is. Most participants understood the idea without specific instructions already on the first round of trials. We argue that with careful design the form of objects and especially their decorative details can serve as a gaze-based user interface in smart homes and other environments of ubiquitous computing.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128172432","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"TobiiGlassesPySuite: an open-source suite for using the Tobii Pro Glasses 2 in eye-tracking studies","authors":"D. D. Tommaso, A. Wykowska","doi":"10.1145/3314111.3319828","DOIUrl":"https://doi.org/10.1145/3314111.3319828","url":null,"abstract":"In this paper we present the TobiiGlassesPySuite, an open-source suite we implemented for using the Tobii Pro Glasses 2 wearable eye-tracker in custom eye-tracking studies. We provide a platform-independent solution for controlling the device and for managing the recordings. The software consists of Python modules, integrated into a single package, accompanied by sample scripts and recordings. The proposed solution aims at providing additional methods with respect to the manufacturer's software, for allowing the users to exploit more the device's capabilities and the existing software. Our suite is available for download from the repository indicated in the paper and usable according to the terms of the GNU GPL v3.0 license.","PeriodicalId":161901,"journal":{"name":"Proceedings of the 11th ACM Symposium on Eye Tracking Research & Applications","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2019-06-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125619621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}