Proceedings of the Ninth Biennial ACM Symposium on Eye Tracking Research & Applications: Latest Publications

Modified DBSCAN algorithm on oculomotor fixation identification
Beibin Li, Quan Wang, E. Barney, Logan Hart, Carla A. Wall, K. Chawarska, I. S. D. Urabain, T. J. Smith, F. Shic
DOI: https://doi.org/10.1145/2857491.2888587 | Published: 2016-03-14
Abstract: This paper modifies the DBSCAN algorithm to identify fixations and saccades. This method combines advantages from dispersion-based algorithms, such as resilience to noise and intuitive fixational structure, and from velocity-based algorithms, such as the ability to deal appropriately with smooth pursuit (SP) movements.
Citations: 13
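The paper's actual modification adds velocity-aware handling (e.g., for smooth pursuit) on top of density-based clustering; those details are in the paper itself. As a rough illustration of the underlying idea only, plain spatial DBSCAN over gaze samples already separates dense fixation clusters from sparse saccade samples. The thresholds and point layout below are illustrative assumptions, not the authors' algorithm:

```python
from math import hypot

def dbscan_fixations(points, eps=30.0, min_pts=5):
    """Cluster 2D gaze samples: dense clusters ~ fixations, noise ~ saccades.

    points: list of (x, y) screen coordinates; eps is in the same units.
    Returns one label per sample: a cluster id (0, 1, ...) or -1 for noise.
    """
    n = len(points)
    labels = [None] * n  # None = unvisited

    def neighbors(i):
        xi, yi = points[i]
        return [j for j, (xj, yj) in enumerate(points)
                if hypot(xj - xi, yj - yi) <= eps]

    cluster = -1
    for i in range(n):
        if labels[i] is not None:
            continue
        seeds = neighbors(i)
        if len(seeds) < min_pts:
            labels[i] = -1            # provisionally noise (saccade sample)
            continue
        cluster += 1                  # i is a core point: start a fixation
        labels[i] = cluster
        queue = [j for j in seeds if j != i]
        while queue:
            j = queue.pop()
            if labels[j] == -1:
                labels[j] = cluster   # border point reached from a core point
            if labels[j] is not None:
                continue
            labels[j] = cluster
            js = neighbors(j)
            if len(js) >= min_pts:    # j is itself a core point: keep expanding
                queue.extend(js)
    return labels
```

With two tight 9-sample clusters and three widely spaced samples between them, the clusters come out as fixations 0 and 1 and the in-between samples as noise.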
Design and validation of a simple eye-tracking system
D. Liston, Sol Simpson, Lily R. Wong, M. Rich, L. Stone
DOI: https://doi.org/10.1145/2857491.2857534 | Published: 2016-03-14
Abstract: To address the need for portable systems to collect high-quality eye movement data for field studies, this paper shows how one might design, test, and validate the spatiotemporal fidelity of a homebrewed eye-tracking system. To assess spatial and temporal precision, we describe three validation tests that quantify the spatial resolution and temporal synchronization of data acquisition. First, because measurement of pursuit eye movements requires a visual motion display, we measured the timing of luminance transitions of several candidate LCD monitors to ensure sufficient stimulus fidelity. Second, we measured eye position as human observers (n=20) ran a nine-point calibration in a clinical-grade chin rest, delivering eye-position noise of 0.22 deg (range: 0.09-0.29 deg) and accuracy of 0.97 deg (range: 0.54-1.89 deg). Third, we measured the overall processing delay in the system to be 5.6 ms, accounted for by the response dynamics of our monitor and the duration of one camera frame. The validation methods presented can be used 1) to ensure that eye-position accuracy and precision are sufficient to support scientific and clinical studies and are not limited by the hardware or software, and 2) to verify that the eyetracker, display, and experiment-control software are effectively synchronized.
Citations: 6
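The noise and accuracy figures quoted in the abstract follow definitions that are common in eye-tracking validation: accuracy as the mean angular offset of gaze from the calibration target, and precision as the RMS of successive sample-to-sample displacements. A minimal sketch of those two computations, assuming gaze samples are already converted to degrees of visual angle (the authors' exact formulas may differ):

```python
from math import hypot, sqrt

def accuracy_deg(samples, target):
    """Accuracy: mean angular distance of gaze samples from the known target."""
    tx, ty = target
    return sum(hypot(x - tx, y - ty) for x, y in samples) / len(samples)

def precision_rms(samples):
    """Precision: RMS of successive sample-to-sample displacements (noise)."""
    d2 = [(samples[i + 1][0] - samples[i][0]) ** 2 +
          (samples[i + 1][1] - samples[i][1]) ** 2
          for i in range(len(samples) - 1)]
    return sqrt(sum(d2) / len(d2))
```

Note that accuracy depends on knowing the true target location, while precision does not, which is why the two can diverge widely on the same recording.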
Diagnosis of spatial thinking using eye tracking and pupilometry
B. Fehringer
DOI: https://doi.org/10.1145/2857491.2888585 | Published: 2016-03-14
Abstract: The aim of the intended dissertation study is to show the diagnostic potential of eye tracking for a spatial thinking test. To this end, a structural overview of different analysis techniques for eye tracking data will be provided using several measures. For a newly developed test of the spatial cognitive ability visualization, the results of the analyzed eye tracking data will be linked to reaction time, accuracy, and associated cognitive processes. It is intended to explore which information can be obtained by pupillometry and the systematic combination of the dimensions of eye movement data (location and time). As indicators of cognitive processes and cognitive workload, the resulting gaze patterns and the computed Index of Cognitive Activity (ICA) [Marshall 2002] will be connected to the participant's performance in a test of the spatial ability factor visualization. The results will contribute to the question of which eye-behavioral measures are able to predict participants' abilities and provide insights into associated cognitive processes.
Citations: 0
Pupil size as an indicator of neurochemical activity during learning
R. C. Hoffing, A. Seitz
DOI: https://doi.org/10.1145/2857491.2888586 | Published: 2016-03-14
Abstract: Neurochemical systems are well studied in animal learning; however, ethical issues limit methodologies to explore these systems in humans. Pupillometry provides a glimpse into the brain's neurochemical systems: pupil dynamics in monkeys have been linked with locus coeruleus (LC) activity, which releases norepinephrine (NE) throughout the brain. The objective of my research is to understand the role of neurochemicals in human learning. Specifically, I aim to 1) establish a non-invasive method to study the role of neurochemicals in human learning, 2) develop methods to monitor learning in real time using pupillometry, and 3) discover causal relationships between neurochemicals and learning in human subjects. In this article, to address Objective 1, we present evidence that pupil dynamics can be used as a surrogate measure of neurochemical activity during learning. Specifically, we hypothesize that norepinephrine modulates the encoding of memories, the influence of which can be measured with pupil dynamics. To examine this hypothesis, a task-irrelevant learning paradigm was used, in which learning is boosted for stimuli temporally paired with task targets. We show that participants better recognize images that are paired with task targets than distractors and, in correspondence, that pupil size changes more for target-paired than distractor-paired images. To further investigate the hypothesis that NE nonspecifically guides learning for stimuli that are present with its release, a second procedure was used that employed an unexpected sound to activate the LC-NE system and induce pupil-size changes; results indicated a corresponding increase in memorization of images paired with the unexpected sounds. Together, these results suggest a relationship between the LC-NE system, pupil-size changes, and learning. My ongoing work aims to develop methods to monitor learning in real time by investigating the relationship between pupil-size changes, eye movements, and learning in the context of a free visual search task. Future work will investigate the causal relationship between neurochemicals, learning, and pupil dynamics by using NE-specific drugs to up- and down-regulate levels of NE during learning.
Citations: 1
Heat map visualization of multi-slice medical images through correspondence matching of video frames
Divesh Lala, A. Nakazawa
DOI: https://doi.org/10.1145/2857491.2857504 | Published: 2016-03-14
Abstract: Visual inspection of medical imagery such as MRI and CT scans is a major task for medical professionals who must diagnose and treat patients without error. Given this goal, visualizing search behavior patterns used to recognize abnormalities in these images is of interest. In this paper we describe the development of a system which automatically generates multiple image-dependent heat maps from eye gaze data of users viewing medical image slices. This system only requires the use of a non-wearable eye gaze tracker and video capturing system. The main automated features are the identification of a medical image slice located inside a video frame and calculation of the correspondence between display screen and raw image eye gaze locations. We propose that the system can be used for eye gaze analysis and diagnostic training in the medical field.
Citations: 2
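The system's core step is the correspondence between screen-space gaze and raw-image coordinates; the paper derives it automatically by matching the slice inside each video frame. The sketch below assumes that matching is already done and the slice's on-screen bounding box is known, which is an illustrative simplification: it maps gaze points into image pixels and accumulates them into a coarse heat-map grid.

```python
def gaze_to_image(gx, gy, bbox, img_w, img_h):
    """Map a screen-space gaze point into image pixel coordinates, given the
    slice's on-screen bounding box (x, y, w, h). Returns None if the gaze
    falls outside the displayed image."""
    bx, by, bw, bh = bbox
    if not (bx <= gx < bx + bw and by <= gy < by + bh):
        return None
    return (int((gx - bx) / bw * img_w), int((gy - by) / bh * img_h))

def heat_map(gaze_points, bbox, img_w, img_h, bins=8):
    """Accumulate mapped gaze points into a coarse bins x bins heat map."""
    grid = [[0] * bins for _ in range(bins)]
    for gx, gy in gaze_points:
        p = gaze_to_image(gx, gy, bbox, img_w, img_h)
        if p is None:
            continue                      # gaze was off the image slice
        px, py = p
        grid[min(py * bins // img_h, bins - 1)][min(px * bins // img_w, bins - 1)] += 1
    return grid
```

A real multi-slice viewer would keep one such grid per slice, keyed by whichever slice the frame-matching step identified as currently displayed.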
FixFix: fixing the fixations
Goran Topic, Akito Yamaya, Akiko Aizawa, Pascual Martínez-Gómez
DOI: https://doi.org/10.1145/2857491.2884060 | Published: 2016-03-14
Abstract: FixFix is a web-based tool for editing reading gaze fixation datasets. The purpose is to provide gaze researchers focusing on reading an easy-to-use interface that will facilitate manual interpretation, but even more so to create gold standard datasets for machine learning and data mining. It allows the users to identify fixations, then move them either singly or in groups, in order to correct both variable and systematic gaze sampling errors.
Citations: 4
Towards automating fixation correction for source code
Christopher Palmer, Bonita Sharif
DOI: https://doi.org/10.1145/2857491.2857544 | Published: 2016-03-14
Abstract: During eye-tracking studies there is a possibility for the actual fixation to shift a little when recorded. The cause of this shift could be due to various reasons such as the accuracy of the calibration or drift. Researchers usually correct fixations manually. Manual corrections are error prone, especially if done on large samples for extended periods. There is also no guarantee that two corrections done by different people on the same data set will be consistent with each other. In order to solve this problem, we introduce an attempt at automatically correcting fixations that uses a variable offset for groups of fixations. Our focus is on source code, which is read differently than natural language, requiring an algorithm that adapts to these differences. We introduce a Hill Climbing algorithm that shifts fixations to a best-fit location based on a scoring function. In order to evaluate the algorithm's effectiveness, we compare the automatically corrected fixations against a set of manually corrected ones, giving us an accuracy of 89%. These findings are discussed with additional ways to improve the algorithm.
Citations: 17
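A hill-climbing corrector of this kind can be sketched generically: shift a group of fixations by a candidate offset and keep moving while a scoring function improves. The `score` callable below is a hypothetical stand-in; the paper's actual scoring function is tailored to how source code is read, and its step sizes and neighborhood are not specified here.

```python
def hill_climb_offset(fixations, score, step=5, max_iter=100):
    """Greedily search for a (dx, dy) shift maximizing score(shifted points).

    fixations: list of (x, y) fixation centers.
    score: callable taking a list of (x, y) points and returning a number,
    e.g. how many points land inside known token bounding boxes.
    """
    dx = dy = 0
    best = score([(x + dx, y + dy) for x, y in fixations])
    for _ in range(max_iter):
        improved = False
        # Try the four axis-aligned neighbor offsets; take the first improvement.
        for ddx, ddy in ((step, 0), (-step, 0), (0, step), (0, -step)):
            cand = [(x + dx + ddx, y + dy + ddy) for x, y in fixations]
            s = score(cand)
            if s > best:
                best, dx, dy, improved = s, dx + ddx, dy + ddy, True
                break
        if not improved:
            break          # local maximum reached
    return dx, dy
```

Hill climbing only finds a local maximum, which is acceptable here because calibration drift is typically a small, roughly constant offset.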
Seamless interaction with scrolling contents on eyewear computers using optokinetic nystagmus eye movements
Shahram Jalaliniya, D. Mardanbegi
DOI: https://doi.org/10.1145/2857491.2857539 | Published: 2016-03-14
Abstract: In this paper we investigate the utility of an eye-based interaction technique (EyeGrip) for seamless interaction with scrolling contents on eyewear computers. EyeGrip uses optokinetic nystagmus (OKN) eye movements to detect an object of interest among a set of scrolling contents and automatically stops scrolling for the user. We empirically evaluated the usability of EyeGrip in two different applications for eyewear computers: 1) a menu scroll viewer and 2) a Facebook newsfeed reader. The results of our study showed that the EyeGrip technique performs as well as the keyboard, which has long been a well-known input device. Moreover, the accuracy of the EyeGrip method for menu item selection was higher, while in the Facebook study participants found the keyboard more accurate.
Citations: 4
Isolating the adaptive element of tonic convergence & divergence
Ian M. Erkelens
DOI: https://doi.org/10.1145/2857491.2888584 | Published: 2016-03-14
Abstract: Eye movements have provided an excellent substrate with which to explore the neural control of motor systems. The simplicity of the neural circuitry and physical plant, in comparison to visually directed limb movements, allows for much easier analysis and extrapolation. The adaptive capabilities of eye movements are robust and reflect the significant neural plasticity within these systems. Although crucial for optimal motor function, these adaptive properties and the neural mechanisms responsible are only beginning to be understood. While limb and saccadic adaptations have been intensively studied, the adaptive response is measured indirectly as a change in the original response. Vergence, however, appears to provide the opportunity to measure the adaptive response in isolation. The following are preliminary results of a study investigating the adaptive properties of vergence eye movements using a main sequence analysis. The effects of stimulus directionality and amplitude are investigated and compared to the reflexive vergence innervation patterns known to exist for similar stimuli.
Citations: 2
Deep eye fixation map learning for calibration-free eye gaze tracking
Kang Wang, Shen Wang, Q. Ji
DOI: https://doi.org/10.1145/2857491.2857515 | Published: 2016-03-14
Abstract: Existing eye trackers typically require an explicit personal calibration procedure to estimate subject-dependent eye parameters. Despite efforts to simplify the calibration process, it remains unnatural and bothersome, in particular for users of personal and mobile devices. To alleviate this problem, we introduce a technique that can eliminate explicit personal calibration. By combining a new calibration procedure with eye fixation prediction, the proposed method performs implicit personal calibration without active participation or even knowledge of the user. Specifically, unlike the traditional deterministic calibration procedure that minimizes the differences between the predicted eye gazes and the actual eye gazes, we introduce a stochastic calibration procedure that minimizes the differences between the probability distribution of the predicted eye gaze and the distribution of the actual eye gaze. Furthermore, instead of using a saliency map to approximate the eye fixation distribution, we propose to use a regression-based deep convolutional neural network (RCNN) that specifically learns image features to predict eye fixation. By combining the distribution-based calibration with the deep fixation prediction procedure, personal eye parameters can be estimated without explicit user collaboration. We apply the proposed method to both 2D regression-based and 3D model-based eye gaze tracking methods. Experimental results show that the proposed method outperforms other implicit calibration methods and achieves comparable results to those that use traditional explicit calibration methods.
Citations: 47
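The paper's stochastic calibration matches the distribution of predicted gaze (from a deep fixation-prediction network) against the distribution of actual gaze, rather than matching individual gaze points. A deliberately tiny 1D toy of that idea, with an L1 histogram distance standing in for the paper's formulation and a grid search standing in for its optimization (every name and number below is an illustrative assumption, not the authors' method):

```python
def implicit_bias_1d(samples, predicted, candidates, bins=10):
    """Distribution-matching toy: pick the constant gaze bias whose removal
    makes the histogram of observed samples closest (L1 distance) to a
    predicted fixation distribution over the same bins.

    samples: observed 1D gaze positions in [0, 1), possibly biased.
    predicted: list of `bins` probabilities (the predicted fixation map).
    candidates: bias values to try.
    """
    def hist(vals):
        h = [0.0] * bins
        for v in vals:
            b = min(max(int(v * bins), 0), bins - 1)
            h[b] += 1.0 / len(vals)
        return h

    def l1(h):
        return sum(abs(a - b) for a, b in zip(h, predicted))

    # The bias minimizing distribution mismatch is the implicit calibration.
    return min(candidates, key=lambda c: l1(hist([v - c for v in samples])))
```

The point of the toy is that no per-point ground truth is needed: only the predicted fixation distribution constrains the bias, which is what lets the real method calibrate without the user's knowledge.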