Proceedings of the Symposium on Eye Tracking Research and Applications: Latest Publications

EyeTab: model-based gaze estimation on unmodified tablet computers
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578185
Erroll Wood, A. Bulling
{"title":"EyeTab: model-based gaze estimation on unmodified tablet computers","authors":"Erroll Wood, A. Bulling","doi":"10.1145/2578153.2578185","DOIUrl":"https://doi.org/10.1145/2578153.2578185","url":null,"abstract":"Despite the widespread use of mobile phones and tablets, hand-held portable devices have only recently been identified as a promising platform for gaze-aware applications. Estimating gaze on portable devices is challenging given their limited computational resources, low quality integrated front-facing RGB cameras, and small screens to which gaze is mapped. In this paper we present EyeTab, a model-based approach for binocular gaze estimation that runs entirely on an unmodified tablet. EyeTab builds on set of established image processing and computer vision algorithms and adapts them for robust and near-realtime gaze estimation. A technical prototype evaluation with eight participants in a normal indoors office setting shows that EyeTab achieves an average gaze estimation accuracy of 6.88° of visual angle at 12 frames per second.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115392424","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 233
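The reported 6.88° accuracy becomes more tangible once converted to an on-screen distance. A minimal sketch of the standard visual-angle conversion; the 30 cm tablet viewing distance is an assumption for illustration, not a figure from the paper:

```python
import math

def visual_angle_to_screen_error(angle_deg: float, viewing_distance_cm: float) -> float:
    """Convert a gaze error in degrees of visual angle to centimeters on screen."""
    return viewing_distance_cm * math.tan(math.radians(angle_deg))

# At an assumed 30 cm tablet viewing distance, 6.88 degrees is roughly 3.6 cm on screen
error_cm = visual_angle_to_screen_error(6.88, 30.0)
print(f"{error_cm:.2f} cm")
```

On a small tablet screen that error spans several on-screen targets, which is why the paper frames the result as a feasibility milestone rather than a pointing-accuracy one.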
Estimating point-of-regard using corneal surface image
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578197
K. Takemura, Shunki Kimura, Sara Suda
{"title":"Estimating point-of-regard using corneal surface image","authors":"K. Takemura, Shunki Kimura, Sara Suda","doi":"10.1145/2578153.2578197","DOIUrl":"https://doi.org/10.1145/2578153.2578197","url":null,"abstract":"Recently, the eye-tracker has been developed as a daily-use device. However, when an eye-tracker is used daily, the problem of calibration arises. Even when the calibration for computing the relationship between the scene and eye camera is conducted in advance, the relationship is not maintained in prolonged use. Therefore, we propose a method for conserving the relationship between the scene and eye camera during the execution of an eye-tracking program. The texture information of the corneal surface image is used to estimate the point-of-regard. We confirm the feasibility of the proposed method through preliminary experiments.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124407006","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 14
EYEDIAP: a database for the development and evaluation of gaze estimation algorithms from RGB and RGB-D cameras
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578190
Kenneth Alberto Funes Mora, Florent Monay, J. Odobez
{"title":"EYEDIAP: a database for the development and evaluation of gaze estimation algorithms from RGB and RGB-D cameras","authors":"Kenneth Alberto Funes Mora, Florent Monay, J. Odobez","doi":"10.1145/2578153.2578190","DOIUrl":"https://doi.org/10.1145/2578153.2578190","url":null,"abstract":"The lack of a common benchmark for the evaluation of the gaze estimation task from RGB and RGB-D data is a serious limitation for distinguishing the advantages and disadvantages of the many proposed algorithms found in the literature. This paper intends to overcome this limitation by introducing a novel database along with a common framework for the training and evaluation of gaze estimation approaches. In particular, we have designed this database to enable the evaluation of the robustness of algorithms with respect to the main challenges associated to this task: i) Head pose variations; ii) Person variation; iii) Changes in ambient and sensing conditions and iv) Types of target: screen or 3D object.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"60 9","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120998934","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 260
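Benchmarks like this one conventionally report accuracy as the angle between the estimated and ground-truth 3D gaze vectors. A minimal sketch of that metric; the function name is ours, not part of the database's evaluation framework:

```python
import math

def angular_error_deg(est, gt):
    """Angle in degrees between an estimated and a ground-truth 3D gaze vector."""
    dot = sum(a * b for a, b in zip(est, gt))
    norm = math.sqrt(sum(a * a for a in est)) * math.sqrt(sum(b * b for b in gt))
    cos = max(-1.0, min(1.0, dot / norm))  # clamp to guard against rounding
    return math.degrees(math.acos(cos))

print(angular_error_deg((1, 0, 0), (0, 1, 0)))  # orthogonal vectors: 90 degrees
```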
Towards fine-grained fixation analysis: distilling out context dependence
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578167
Neil D. B. Bruce
{"title":"Towards fine-grained fixation analysis: distilling out context dependence","authors":"Neil D. B. Bruce","doi":"10.1145/2578153.2578167","DOIUrl":"https://doi.org/10.1145/2578153.2578167","url":null,"abstract":"In this paper, we explore the problem of analyzing gaze patterns towards attributing greater meaning to observed fixations. In recent years, there have been a number of efforts that attempt to categorize fixations according to their properties. Given that there are a multitude of factors that may contribute to fixational behavior, including both bottom-up and top-down influences on neural mechanisms for visual representation and saccadic control, efforts to better understand factors that may contribute to any given fixation may play an important role in augmenting raw fixation data. A grand objective of this line of thinking is in explaining the reason for any observed fixation as a combination of various latent factors. In the current work, we do not seek to solve this problem in general, but rather to factor out the role of the holistic structure of a scene as one observable, and quantifiable factor that plays a role in determining fixational behavior. Statistical methods and approximations to achieve this are presented, and supported by experimental results demonstrating the efficacy of the proposed methods.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125301774","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
Haptic feedback to gaze events
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578154
J. Kangas, Jussi Rantala, P. Majaranta, Poika Isokoski, R. Raisamo
{"title":"Haptic feedback to gaze events","authors":"J. Kangas, Jussi Rantala, P. Majaranta, Poika Isokoski, R. Raisamo","doi":"10.1145/2578153.2578154","DOIUrl":"https://doi.org/10.1145/2578153.2578154","url":null,"abstract":"Eye tracking input often relies on visual and auditory feedback. Haptic feedback offers a previously unused alternative to these established methods. We describe a study to determine the natural time limits for haptic feedback to gazing events. The target is to determine how much time we can use to evaluate the user gazed object and decide if we are going to give the user a haptic notification on that object or not. The results indicate that it is best to get feedback faster than in 250 milliseconds from the start of fixation of an object. Longer delay leads to increase in incorrect associations between objects and the feedback. Delays longer than 500 milliseconds were confusing for the user.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122782766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
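The 250 ms budget from fixation onset can be read as a deadline for the object-evaluation step in a gaze-plus-haptics loop. A hypothetical sketch of such a gate; the two timing constants come from the study's results, but the gating logic itself is our illustration, not the authors' implementation:

```python
FEEDBACK_DEADLINE_MS = 250  # feedback later than this is increasingly misattributed
CONFUSION_LIMIT_MS = 500    # beyond this delay, feedback confused users in the study

def should_trigger_haptics(fixation_onset_ms: float, decision_ready_ms: float) -> bool:
    """Trigger haptic feedback only if object evaluation finished within the deadline."""
    elapsed = decision_ready_ms - fixation_onset_ms
    return 0 <= elapsed <= FEEDBACK_DEADLINE_MS

print(should_trigger_haptics(1000.0, 1180.0))  # 180 ms after onset: True
print(should_trigger_haptics(1000.0, 1400.0))  # 400 ms after onset: False
```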
Comparing estimated gaze depth in virtual and physical environments
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578168
A. Duchowski, D. House, Jordan Gestring, Robert Congdon, Lech Swirski, N. Dodgson, Krzysztof Krejtz, I. Krejtz
{"title":"Comparing estimated gaze depth in virtual and physical environments","authors":"A. Duchowski, D. House, Jordan Gestring, Robert Congdon, Lech Swirski, N. Dodgson, Krzysztof Krejtz, I. Krejtz","doi":"10.1145/2578153.2578168","DOIUrl":"https://doi.org/10.1145/2578153.2578168","url":null,"abstract":"We show that the error in 3D gaze depth (vergence) estimated from binocularly-tracked gaze disparity is related to the viewing distance of the screen calibration plane at which 2D gaze is recorded. In a stereoscopic (virtual) environment, this relationship is evident in gaze to target depth error: vergence error behind the screen is greater than in front of the screen and is lowest at the screen depth. In a physical environment, with no accommodation-vergence conflict, the magnitude of vergence error in front of the 2D calibration plane appears reversed, increasing with distance from the viewer.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"281 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122940341","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 31
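The vergence-based depth estimate evaluated here follows from similar triangles: for eyes a baseline e apart viewing a screen at distance D, an on-screen gaze disparity d gives depth Z = eD/(e + d), with d > 0 for targets in front of the screen. A minimal sketch under an assumed 6.5 cm interpupillary distance (the paper's exact estimator may differ):

```python
def gaze_depth_cm(disparity_cm: float, screen_distance_cm: float, ipd_cm: float = 6.5) -> float:
    """Estimate 3D gaze depth from on-screen binocular gaze disparity.

    disparity_cm is x_left - x_right of the two gaze points on the screen plane;
    zero disparity means the eyes converge exactly at the screen depth.
    """
    return ipd_cm * screen_distance_cm / (ipd_cm + disparity_cm)

print(gaze_depth_cm(0.0, 60.0))   # converging at the screen: 60.0
print(gaze_depth_cm(3.25, 60.0))  # crossed disparity: 40.0, i.e. in front of the screen
```

The formula's sensitivity grows with depth, which is consistent with the abstract's observation that vergence error behind the screen exceeds error in front of it.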
ISeeCube: visual analysis of gaze data for video
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2628812
K. Kurzhals, Florian Heimerl, D. Weiskopf
{"title":"ISeeCube: visual analysis of gaze data for video","authors":"K. Kurzhals, Florian Heimerl, D. Weiskopf","doi":"10.1145/2578153.2628812","DOIUrl":"https://doi.org/10.1145/2578153.2628812","url":null,"abstract":"We introduce a new design for the visual analysis of eye tracking data recorded from dynamic stimuli such as video. ISeeCube includes multiple coordinated views to support different aspects of various analysis tasks. It combines methods for the spatiotemporal analysis of gaze data recorded from unlabeled videos as well as the possibility to annotate and investigate dynamic Areas of Interest (AOIs). A static overview of the complete data set is provided by a space-time cube visualization that shows gaze points with density-based color mapping and spatiotemporal clustering of the data. A timeline visualization supports the analysis of dynamic AOIs and the viewers' attention on them. AOI-based scanpaths of different viewers can be clustered by their Levenshtein distance, an attention map, or the transitions between AOIs. With the provided visual analytics techniques, the exploration of eye tracking data recorded from several viewers is supported for a wide range of analysis tasks.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129694203","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 15
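Clustering AOI-based scanpaths by Levenshtein distance treats each scanpath as a string of AOI labels. A minimal sketch of the distance computation itself (the AOI labels are illustrative; ISeeCube's clustering pipeline adds more on top):

```python
def levenshtein(a: str, b: str) -> int:
    """Edit distance between two AOI label sequences (insert/delete/substitute)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

# Two viewers' scanpaths over AOIs A, B, C differ by one edit (the extra visit to C)
print(levenshtein("ABCA", "ABA"))  # 1
```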
Design issues of remote eye tracking systems with large range of movement
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578193
Laura Sesma, A. Villanueva, R. Cabeza
{"title":"Design issues of remote eye tracking systems with large range of movement","authors":"Laura Sesma, A. Villanueva, R. Cabeza","doi":"10.1145/2578153.2578193","DOIUrl":"https://doi.org/10.1145/2578153.2578193","url":null,"abstract":"One of the goals of the eye tracking community is to build systems that allow users to move freely. In general, there is a trade-off between the field of view of an eye tracking system and the gaze estimation accuracy. We aim to study how much the field of view of an eye tracking system can be increased, while maintaining acceptable accuracy. In this paper, we investigate all the issues concerning remote eye tracking systems with large range of movement in a simulated environment and we give some guidelines that can facilitate the process of designing an eye tracker. Given a desired range of movement and a working distance, we can calculate the camera focal length and sensor size or given a certain camera, we can determine the user's range of movement. The robustness against large head movement of two gaze estimation methods based on infrared light is analyzed: an interpolation and a geometrical method. We relate the accuracy of the gaze estimation methods with the image resolution around the eye area for a certain feature detector's accuracy and provide possible combinations of pixel size and focal length for different gaze estimation accuracies. 
Finally, we give the gaze estimation accuracy as a function of a new defined eye error, which is independent of any design parameters.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123905723","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 3
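The focal-length guideline the abstract refers to is, at its core, the pinhole relation: imaging a head box of width W at working distance Z onto a sensor of width s requires a focal length of roughly f = Z·s/W. A minimal sketch; the example numbers are our assumptions, not the paper's:

```python
def focal_length_mm(working_distance_mm: float, sensor_width_mm: float,
                    movement_range_mm: float) -> float:
    """Pinhole focal length that maps the allowed head-movement range onto the sensor."""
    return working_distance_mm * sensor_width_mm / movement_range_mm

# Assumed setup: 4.8 mm wide sensor, 600 mm working distance, 400 mm wide head box
print(focal_length_mm(600.0, 4.8, 400.0))  # 7.2
```

The same relation run in reverse gives the paper's other direction: fixing the camera (f and s) determines the head-movement range achievable at a given working distance.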
Verbal gaze instruction matches visual gaze guidance in laparoscopic skills training
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578217
G. Tien, M. S. Atkins, Xianta Jiang, B. Zheng, R. Bednarik
{"title":"Verbal gaze instruction matches visual gaze guidance in laparoscopic skills training","authors":"G. Tien, M. S. Atkins, Xianta Jiang, B. Zheng, R. Bednarik","doi":"10.1145/2578153.2578217","DOIUrl":"https://doi.org/10.1145/2578153.2578217","url":null,"abstract":"Novices were trained to perform a unimanual peg transport task in a laparoscopic training box with an illuminated interior displayed on a monitor. Subjects were divided into two groups; one group was verbally instructed to direct their gaze at distant targets, while the other group had their gaze behaviour implicitly manipulated using distant target illumination. Both groups achieved similar task completion times post-training and developed peripheral vision strategies leading to delayed foveation on targets until the instrument was closer to its destination, although the ability to focus on targets earlier during manual movements as done by an expert surgeon was quickly regained by the verbal instruction group post-training. This suggests that care should be taken when employing visual attention cuing methods such as target highlighting for training eye-hand coordination skills, as simple verbal instruction may be sufficient to help trainees to adopt more expert-like gaze behaviours.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"220 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131457265","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 8
Experts vs. novices: applying eye-tracking methodologies in colonoscopy video screening for polyp search
Proceedings of the Symposium on Eye Tracking Research and Applications | Pub Date: 2014-03-26 | DOI: 10.1145/2578153.2578189
Jorge Bernal, F. Sánchez, F. Vilariño, Mirko Arnold, Anarta Ghosh, G. Lacey
{"title":"Experts vs. novices: applying eye-tracking methodologies in colonoscopy video screening for polyp search","authors":"Jorge Bernal, F. Sánchez, F. Vilariño, Mirko Arnold, Anarta Ghosh, G. Lacey","doi":"10.1145/2578153.2578189","DOIUrl":"https://doi.org/10.1145/2578153.2578189","url":null,"abstract":"We present in this paper a novel study aiming at identifying the differences in visual search patterns between physicians of diverse levels of expertise during the screening of colonoscopy videos. Physicians were clustered into two groups -experts and novices- according to the number of procedures performed, and fixations were captured by an eye-tracker device during the task of polyp search in different video sequences. These fixations were integrated into heat maps, one for each cluster. The obtained maps were validated over a ground truth consisting of a mask of the polyp, and the comparison between experts and novices was performed by using metrics such as reaction time, dwelling time and energy concentration ratio. Experimental results show a statistically significant difference between experts and novices, and the obtained maps show to be a useful tool for the characterisation of the behaviour of each group.","PeriodicalId":142459,"journal":{"name":"Proceedings of the Symposium on Eye Tracking Research and Applications","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2014-03-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117060506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
引用次数: 11
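Of the reported metrics, the energy concentration ratio is the share of fixation heat-map mass that falls inside the ground-truth polyp mask. A minimal sketch of how it could be computed; the implementation details are ours, not the paper's:

```python
def energy_concentration_ratio(heat_map, mask):
    """Fraction of total heat-map energy inside the ground-truth polyp mask.

    heat_map: 2D list of non-negative fixation densities
    mask:     2D list of 0/1 values of the same shape
    """
    inside = sum(h * m for row_h, row_m in zip(heat_map, mask)
                 for h, m in zip(row_h, row_m))
    total = sum(h for row in heat_map for h in row)
    return inside / total if total else 0.0

heat = [[0.0, 2.0], [1.0, 1.0]]   # toy 2x2 fixation density map
mask = [[0, 1], [0, 0]]           # polyp occupies the top-right cell
print(energy_concentration_ratio(heat, mask))  # 0.5
```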