Proceedings of the Symposium on Eye Tracking Research and Applications: Latest Articles

Self-localization using fixations as landmarks
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168581
Lisa M. Tiberio, R. Canosa
Self-localization is the process of knowing your position and location relative to your surroundings. This research integrated artificial intelligence techniques into a custom-built portable eye tracker for the purpose of automating the process of determining indoor self-localization. Participants wore the eye tracker and walked a series of corridors while a video of the scene was recorded along with fixation locations. Patches of the scene video without fixation information were used to train the classifier by creating feature maps of the corridors. For testing the classifier, fixation locations in the scene were extracted and used to determine the location of the participant. Scene patches surrounding fixations were used for the classification instead of objects in the environment. This eliminated the need for complex computer vision object recognition algorithms and made scene classification less dependent upon objects and their placement in the environment. This allowed for a sparse representation of the scene, since image processing to detect and recognize objects was not necessary to determine location. Experimentally, image patches surrounding fixations were found to be a highly reliable indicator of location, as compared to random image patches, non-fixated salient image patches, or other non-salient scene locations. In some cases, only a single fixation was needed to accurately identify the correct location of the participant. To the best of our knowledge, this technique has not been used before for determining human self-localization in either indoor or outdoor settings.
Citations: 0
Gaming with gaze and losing with a smile
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168638
Anders Møller Nielsen, Anders Petersen, J. P. Hansen
This paper presents an experiment comparing performance and user experience of gaze and mouse interaction in a minimalistic 3D flying game that only required steering. Mouse interaction provided better performance, and participants considered it less physically and mentally demanding, less frustrating, and less difficult to maneuver. Gaze interaction, however, yielded higher levels of entertainment and engagement. The paper suggests that gaze steering provides high kinesthetic pleasure both because it is difficult to master and because it presents a unique mapping between fixation and locomotion.
Citations: 21
Entropy-based correction of eye tracking data for static scenes
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168620
Samuel John, E. Weitnauer, Hendrik Koesling
In a typical head-mounted eye tracking system, any small slippage of the eye tracker headband on the participant's head leads to a systematic error in the recorded gaze positions. While various approaches exist that reduce these errors at recording time, only few methods reduce the errors of a given tracking system after recording. In this paper we introduce a novel correction algorithm that can significantly reduce the drift in recorded gaze data for eye tracking experiments that use static stimuli. The algorithm is entropy-based and needs no prior knowledge about the stimuli shown or the tasks participants accomplish during the experiment.
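The entropy objective underlying such a post-hoc correction can be sketched as follows. This is a minimal illustration under assumptions of ours, not the paper's algorithm: the grid search in `best_offset`, the 32-bin histogram, and the joint-entropy criterion are all assumed. The intuition is that systematic drift smears the aggregate gaze histogram, so the constant offset that minimizes its Shannon entropy re-sharpens it.

```python
import numpy as np

def gaze_entropy(points, extent, bins=32):
    """Shannon entropy (bits) of the 2D histogram of gaze points."""
    hist, _, _ = np.histogram2d(points[:, 0], points[:, 1],
                                bins=bins, range=extent)
    p = hist / hist.sum()
    p = p[p > 0]                      # 0 * log(0) := 0
    return -np.sum(p * np.log2(p))

def best_offset(segment, reference, extent, offsets):
    """Pick the (dx, dy) shift of `segment` that minimizes the joint
    entropy with the already-corrected `reference` samples."""
    def joint(off):
        return gaze_entropy(np.vstack([reference, segment + off]), extent)
    return min(offsets, key=lambda off: joint(np.array(off)))
```

A drifted recording segment shifted back onto the reference distribution collapses into the same histogram bins, driving the entropy down.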
Citations: 4
Error characterization and compensation in eye tracking systems
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168595
Juan J. Cerrolaza, A. Villanueva, Maria Villanueva, R. Cabeza
The development of systems that track the eye while allowing head movement is one of the most challenging objectives of gaze tracking researchers. Tracker accuracy decreases as the subject moves from the calibration position and is especially influenced by changes in depth with respect to the screen. In this paper, we demonstrate that the pattern of error produced due to user movement mainly depends on the system configuration and hardware element placement rather than the user. Thus, we suggest alternative calibration techniques for error reduction that compensate for the lack of accuracy due to subject movement. Using these techniques, we can achieve an error reduction of more than 50%.
Citations: 53
Eye-based head gestures
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168578
D. Mardanbegi, D. Hansen, Thomas Pederson
A novel method for video-based head gesture recognition using eye information provided by an eye tracker is proposed. The method uses a combination of gaze and eye movement to infer head gestures. Compared to other gesture-based methods, a major advantage of the method is that the user keeps the gaze on the interaction object while interacting. The method has been implemented on a head-mounted eye tracker for detecting a set of predefined head gestures. The accuracy of the gesture classifier is evaluated and verified for gaze-based interaction in applications intended for both large public displays and small mobile phone screens. The user study shows that the method detects a set of defined gestures reliably.
Citations: 73
A robust realtime reading-skimming classifier
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168575
R. Biedert, Jörn Hees, A. Dengel, Georg Buscher
Distinguishing whether eye tracking data reflects reading or skimming already proved to be of high analytical value. But with a potentially more widespread usage of eye tracking systems at home, in the office or on the road, the amount of environmental and experimental control tends to decrease. This in turn leads to an increase in eye tracking noise and inaccuracies which are difficult to address with current reading detection algorithms. In this paper we propose a method for constructing and training a classifier that is able to robustly distinguish reading from skimming patterns. It operates in real time, considering a window of saccades and computing features such as the average forward speed and angularity. The algorithm inherently deals with distorted eye tracking data and provides a robust, linear classification into the two classes read and skimmed. It facilitates reaction times of 750 ms on average, is adjustable in its horizontal sensitivity and provides confidence values for its classification results; it is also straightforward to implement. Trained on a set of six users and evaluated on an independent test set of six different users, it achieved an 86% classification accuracy and outperformed two other methods.
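The windowed features named in the abstract (average forward speed and angularity) might be computed along these lines; the exact definitions, the signed horizontal speed, and the fixation tuple layout are assumptions of ours, not taken from the paper:

```python
import math

def saccade_features(fixations):
    """Features over a window of fixations given as (x, y, t) tuples:
    mean horizontal forward speed (signed, so regressions pull it down)
    and mean absolute saccade angle relative to the reading direction."""
    speeds, angles = [], []
    for (x0, y0, t0), (x1, y1, t1) in zip(fixations, fixations[1:]):
        dt = max(t1 - t0, 1e-6)                     # guard zero intervals
        speeds.append((x1 - x0) / dt)
        angles.append(abs(math.atan2(y1 - y0, x1 - x0)))
    n = len(speeds)
    return sum(speeds) / n, sum(angles) / n
```

Reading produces a steady positive forward speed with near-zero angularity; skimming yields larger, more erratic angles, which a linear classifier can separate.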
Citations: 73
Aggregate gaze visualization with real-time heatmaps
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168558
A. Duchowski, Margaux M. Price, Miriah D. Meyer, P. Orero
A GPU implementation is given for real-time visualization of aggregate eye movements (gaze) via heatmaps. Parallelization of the algorithm leads to substantial speedup over its CPU-based implementation and, for the first time, allows real-time rendering of heatmaps atop video. GLSL shader colorization allows the choice of color ramps. Several luminance-based color maps are advocated as alternatives to the popular rainbow color map, considered inappropriate (harmful) for depiction of (relative) gaze distributions.
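The paper's contribution is the GPU/GLSL implementation, which is not reproduced here; the underlying accumulation that gets parallelized can be sketched on the CPU as follows (the Gaussian kernel shape, `sigma`, and duration weighting are our assumptions):

```python
import numpy as np

def gaze_heatmap(fixations, shape, sigma=20.0):
    """Accumulate a Gaussian kernel at each (x, y, duration) fixation,
    then normalize to [0, 1] so a luminance color ramp can be applied."""
    h, w = shape
    ys, xs = np.mgrid[0:h, 0:w]
    heat = np.zeros(shape)
    for x, y, dur in fixations:
        heat += dur * np.exp(-((xs - x) ** 2 + (ys - y) ** 2)
                             / (2.0 * sigma ** 2))
    return heat / heat.max()       # assumes at least one fixation
```

On the GPU, each pixel of this accumulation is independent, which is what makes the per-frame rendering atop video feasible.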
Citations: 69
Bayesian online clustering of eye movement data
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168617
Enkelejda Kasneci, G. Kasneci, W. Rosenstiel, M. Bogdan
The task of automatically tracking the visual attention in dynamic visual scenes is highly challenging. To approach it, we propose a Bayesian online learning algorithm. As the visual scene changes and new objects appear, based on a mixture model, the algorithm can identify and tell visual saccades (transitions) from visual fixation clusters (regions of interest). The approach is evaluated on real-world data, collected from eye-tracking experiments in driving sessions.
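The paper's Bayesian mixture model is not reproduced here; as a much-simplified stand-in, a greedy online clustering conveys the flavor of separating fixation clusters from transitions (the `radius` threshold and the incremental mean update are our assumptions, not the authors' method):

```python
import math

def online_fixation_clusters(gaze, radius=50.0):
    """Assign each incoming (x, y) gaze sample to the nearest existing
    cluster mean within `radius`, or start a new cluster. Jumps between
    clusters correspond to saccade-like transitions."""
    clusters = []                        # each entry: [mean_x, mean_y, count]
    labels = []
    for x, y in gaze:
        best, best_d = None, radius
        for i, (cx, cy, n) in enumerate(clusters):
            d = math.hypot(x - cx, y - cy)
            if d < best_d:
                best, best_d = i, d
        if best is None:
            clusters.append([x, y, 1])
            labels.append(len(clusters) - 1)
        else:
            cx, cy, n = clusters[best]   # incremental mean update
            clusters[best] = [(cx * n + x) / (n + 1),
                              (cy * n + y) / (n + 1), n + 1]
            labels.append(best)
    return labels, clusters
```

Unlike this sketch, a mixture model yields soft assignments and principled uncertainty, which is what makes the approach robust in dynamic driving scenes.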
Citations: 82
The precision of eye-trackers: a case for a new measure
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168618
P. Blignaut, T. Beelders
Several possible measures for the precision of an eye-tracker exist. The commonly used measures of standard deviation and RMS fail to produce replicable results under varying frame rate, gaze distance, and arrangement of samples within a fixation, which makes it difficult to compare eye-trackers. It is proposed that an area-based measure, BCEA, be adapted to provide a one-dimensional quantity that is intuitive, independent of frame rate, and sensitive to small jerks in the reported fixation position.
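For context, the conventional sample-to-sample RMS measure and the standard BCEA formula (with k = -ln(1 - P) for coverage probability P) can be computed as below; the paper's proposed one-dimensional adaptation of BCEA is not reproduced here:

```python
import numpy as np

def rms_s2s(x, y):
    """Root mean square of sample-to-sample distances within a fixation."""
    dx, dy = np.diff(x), np.diff(y)
    return np.sqrt(np.mean(dx ** 2 + dy ** 2))

def bcea(x, y, p=0.68):
    """Bivariate contour ellipse area enclosing proportion p of samples:
    BCEA = 2 * k * pi * sx * sy * sqrt(1 - rho^2), k = -ln(1 - p)."""
    k = -np.log(1.0 - p)
    sx, sy = np.std(x, ddof=1), np.std(y, ddof=1)
    rho = np.corrcoef(x, y)[0, 1]
    return 2.0 * k * np.pi * sx * sy * np.sqrt(1.0 - rho ** 2)
```

RMS depends on the ordering of samples (hence on frame rate), while BCEA depends only on their dispersion, which is the trade-off the abstract targets.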
Citations: 34
Simple gaze gestures and the closure of the eyes as an interaction technique
Proceedings of the Symposium on Eye Tracking Research and Applications Pub Date : 2012-03-28 DOI: 10.1145/2168556.2168579
Henna Heikkilä, Kari-Jouko Räihä
We created a set of gaze gestures that utilize the following three elements: simple one-segment gestures, off-screen space, and the closure of the eyes. These gestures are to be used as the moving tool in a gaze-only controlled drawing application. We tested our gaze gestures with 24 participants and analyzed the gesture durations, the accuracy of the stops, and the gesture performance. We found that the difference in gesture durations between short and long gestures was so small that there is no need to choose between them. The stops made by closing both eyes were accurate, and the input method worked well for this purpose. With some adjustments and with the possibility for personal settings, the gesture performance and the accuracy of the stops can become even better.
Citations: 38