Eye Tracking Research & Application: Latest Articles

Age differences in visual search for information on web pages
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968379
S. Josephson, Michael E. Holmes
{"title":"Age differences in visual search for information on web pages","authors":"S. Josephson, Michael E. Holmes","doi":"10.1145/968363.968379","DOIUrl":"https://doi.org/10.1145/968363.968379","url":null,"abstract":"Permission to make digital or hard copies of part or all of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than ACM must be honored. Abstracting with credit is permitted. To copy otherwise, to republish, to post on servers, or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from Permissions Dept, ACM Inc., fax +1 (212) 869-0481 or e-mail permissions@acm.org. © 2004 ACM 1-58113-825-3/04/0003 $5.00 Age Differences in Visual Search for Information on Web Pages","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116296003","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
Visual deictic reference in a collaborative virtual environment
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968369
A. Duchowski, Nathan Cournia, Brian Cumming, Daniel McCallum, A. Gramopadhye, J. Greenstein, Sajay Sadasivan, R. Tyrrell
{"title":"Visual deictic reference in a collaborative virtual environment","authors":"A. Duchowski, Nathan Cournia, Brian Cumming, Daniel McCallum, A. Gramopadhye, J. Greenstein, Sajay Sadasivan, R. Tyrrell","doi":"10.1145/968363.968369","DOIUrl":"https://doi.org/10.1145/968363.968369","url":null,"abstract":"This paper evaluates the use of Visual Deictic Reference (VDR) in Collaborative Virtual Environments (CVEs). A simple CVE capable of hosting two (or more) participants simultaneously immersed in the same virtual environment is used as the testbed. One participant's VDR, obtained by tracking the participant's gaze, is projected to co-participants' environments in real-time as a colored lightspot. We compare the VDR lightspot when it is eye-slaved to when it is head-slaved and show that an eye-slaved VDR helps disambiguate the deictic point of reference, especially during conditions when the user's line of sight is decoupled from their head direction.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127252926","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 26
Head movement estimation for wearable eye tracker
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968388
C. Rothkopf, J. Pelz
{"title":"Head movement estimation for wearable eye tracker","authors":"C. Rothkopf, J. Pelz","doi":"10.1145/968363.968388","DOIUrl":"https://doi.org/10.1145/968363.968388","url":null,"abstract":"In the study of eye movements in natural tasks, where subjects are able to freely move in their environment, it is desirable to capture a video of the surroundings of the subject not limited to a small field of view as obtained by the scene camera of an eye tracker. Moreover, recovering the head movements could give additional information about the type of eye movement that was carried out, the overall gaze change in world coordinates, and insight into high-order perceptual strategies. Algorithms for the classification of eye movements in such natural tasks could also benefit form the additional head movement data.We propose to use an omnidirectional vision sensor consisting of a small CCD video camera and a hyperbolic mirror. The camera is mounted on an ASL eye tracker and records an image sequence at 60 Hz. Several algorithms for the extraction of rotational motion from this image sequence were implemented and compared in their performance against the measurements of a Fasttrack magnetic tracking system. 
Using data from the eye tracker together with the data obtained by the omnidirectional image sensor, a new algorithm for the classification of different types of eye movements based on a Hidden-Markov-Model was developed.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"76 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133754634","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 51
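The core geometric idea behind recovering rotation from an omnidirectional sensor can be illustrated with a simplification: once the catadioptric image is unwrapped into a panoramic strip, rotation about the vertical axis shifts the panorama horizontally, so yaw can be estimated as the circular shift that best aligns consecutive frames. The sketch below is illustrative only and is not the authors' algorithm; real pipelines register full 2-D images and handle all three rotation axes.

```python
def estimate_yaw_shift(prev_row, curr_row):
    """Estimate yaw as the circular shift (in columns) that best aligns two
    1-D brightness profiles taken from unwrapped panoramic frames.
    Brute-force circular cross-correlation; illustrative only."""
    n = len(prev_row)
    best_shift, best_score = 0, float("-inf")
    for s in range(n):
        # correlation between curr shifted back by s and prev
        score = sum(prev_row[i] * curr_row[(i + s) % n] for i in range(n))
        if score > best_score:
            best_score, best_shift = score, s
    # report signed shift in (-n/2, n/2]
    if best_shift > n // 2:
        best_shift -= n
    return best_shift

def shift_to_degrees(shift, n_columns):
    """Convert a column shift of the unwrapped panorama to a yaw angle."""
    return 360.0 * shift / n_columns
```

With a 36-column strip, each column corresponds to 10 degrees of yaw, so a recovered shift of 3 columns maps to a 30-degree head rotation.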
ECSGlasses and EyePliances: using attention to open sociable windows of interaction
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968384
Jeffrey S. Shell, Roel Vertegaal, D. Cheng, Alexander W. Skaburskis, Changuk Sohn, A. James Stewart, Omar Aoudeh, C. Dickie
{"title":"ECSGlasses and EyePliances: using attention to open sociable windows of interaction","authors":"Jeffrey S. Shell, Roel Vertegaal, D. Cheng, Alexander W. Skaburskis, Changuk Sohn, A. James Stewart, Omar Aoudeh, C. Dickie","doi":"10.1145/968363.968384","DOIUrl":"https://doi.org/10.1145/968363.968384","url":null,"abstract":"We present ECSGlasses: wearable eye contact sensing glasses that detect human eye contact. ECSGlasses report eye contact to digital devices, appliances and EyePliances in the user's attention space. Devices use this attentional cue to engage in a more sociable process of turn taking with users. This has the potential to reduce inappropriate intrusions, and limit their disruptiveness. We describe new prototype systems, including the Attentive Messaging Service (AMS), the Attentive Hit Counter, the first person attentive camcorder eyeBlog, and an updated Attentive Cell Phone. We also discuss the potential of these devices to open new windows of interaction using attention as a communication modality. Further, we present a novel signal-encoding scheme to uniquely identify EyePliances and users wearing ECSGlasses in multiparty scenarios.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133217796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 38
Poster abstract: evaluation of hidden Markov models robustness in uncovering focus of visual attention from noisy eye-tracker data
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968373
Neil Cooke, M. Russell, A. Meyer
{"title":"Poster abstract: evaluation of hidden Markov models robustness in uncovering focus of visual attention from noisy eye-tracker data","authors":"Neil Cooke, M. Russell, A. Meyer","doi":"10.1145/968363.968373","DOIUrl":"https://doi.org/10.1145/968363.968373","url":null,"abstract":"Eye position, captured via an eye tracker, can uncover the focus of visual attention by classifying eye movements into fixations, pursuit or saccades [Duchowski 2003], with the former two indicating foci of visual attention. Such classification requires all other variability in eye tracking data, from sensor error to other eye movements (such as microsaccades, nystagmus and drifts) to accounted for effectively. The hidden Markov model provides a useful way of uncovering focus of visual attention from eye position when the user undertakes visually oriented tasks, allowing variability in eye tracking data to be modelled as a random variable.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115375437","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
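The HMM approach the poster describes can be sketched minimally: two hidden states (fixation, saccade) with Gaussian emissions over sample-to-sample velocity and "sticky" transitions, decoded by Viterbi. The sticky transitions are what absorb tracker noise: a single outlier sample cannot flip the state because the switching penalty outweighs its emission advantage. All parameter values below (velocity means, standard deviations, stay probability) are illustrative assumptions, not taken from the poster.

```python
import math

def viterbi_fix_sac(velocities,
                    means=(5.0, 120.0),   # deg/s: fixation, saccade (assumed)
                    stds=(10.0, 60.0),    # emission spreads (assumed)
                    p_stay=0.95):         # sticky transitions resist noise
    """Decode each velocity sample as 0 (fixation) or 1 (saccade) with a
    two-state HMM: Gaussian emissions, Viterbi decoding. Sketch only."""
    def log_gauss(x, mu, sigma):
        return -0.5 * ((x - mu) / sigma) ** 2 - math.log(sigma * math.sqrt(2 * math.pi))

    log_trans = [[math.log(p_stay), math.log(1 - p_stay)],
                 [math.log(1 - p_stay), math.log(p_stay)]]
    # initialisation: uniform prior over the two states
    score = [log_gauss(velocities[0], means[s], stds[s]) for s in (0, 1)]
    back = []
    for v in velocities[1:]:
        emit = [log_gauss(v, means[s], stds[s]) for s in (0, 1)]
        ptr, new = [0, 0], [0.0, 0.0]
        for s in (0, 1):
            cands = [score[p] + log_trans[p][s] for p in (0, 1)]
            ptr[s] = 0 if cands[0] >= cands[1] else 1
            new[s] = cands[ptr[s]] + emit[s]
        score = new
        back.append(ptr)
    # backtrack the most likely state sequence
    state = 0 if score[0] >= score[1] else 1
    path = [state]
    for ptr in reversed(back):
        state = ptr[state]
        path.append(state)
    return path[::-1]
```

On a toy trace with a clear velocity burst, the decoder labels the burst as a saccade and everything else as fixation, despite the per-sample noise.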
Eye tracking off the shelf
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968375
D. Hansen, D. MacKay, J. P. Hansen, M. Nielsen
{"title":"Eye tracking off the shelf","authors":"D. Hansen, D. MacKay, J. P. Hansen, M. Nielsen","doi":"10.1145/968363.968375","DOIUrl":"https://doi.org/10.1145/968363.968375","url":null,"abstract":"What if eye trackers could be downloaded and used immediately with standard cameras connected to a computer, without the need for an expert to setup the system? This has already the case for head trackers, so why not for eye trackers?Using components off-the-shelf (COTS) for camera-based eye tracking tasks has many advantages, but it certainly introduces several new problems as less assumptions on the system can be made. As a consequence of using COTS the price for eye tracking devices can be reduced while increasing the accessibility of these systems. Eye tracking based on COTS holds potential for a large number of possible applications such as in the games industry and eye typing [Majaranta and Räihä 2002]. Different cameras may be purchased depending on the need and the amount of money the user is willing to spend on the camera. In this framework it is not possible to use IR light sources and other novel engineered devices as they cannot be bought in a common hardware store. Very little control over the cameras and the geometry of the setup can be expected. The methods employed for eye tracking should therefore be able to handle changes in light conditions and image defocusing and scale changes [Hansen and Pece 2003]. On the same token pan-and-tilt cameras cannot be used, thus forcing such systems to be passive. Figure 1 shows a possible setup of a COTS-based eye tracker. When designing systems for the general public, it is unrealistic to assume that people are able to do camera calibration and make accurate setups of camera, monitor and user. Since little is known about the setup, would this then require a vast amount of calibration points needed for gaze estimation? That is, how many calibration points are really needed? 
Obviously the more calibration points are used the better the chances are to be able to infer the mapping from the image to gaze direction. It would even be possible to sample the entire function space provided sufficiently many calibration points are given. From the point of view of the users, a low number of calibration points is preferred as calibration may be considered as a tedious procedure. Systems that require many calibration points for every session are therefore not likely to succeed. It is also important to know the accuracy in gaze determination when using COTS to determine their applicability for various tasks.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124835873","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 36
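The calibration question raised in the abstract has a concrete form in one widely used scheme (not necessarily the authors'): fit, per screen axis, a second-order polynomial from pupil-image coordinates to screen coordinates by least squares. That basis has 6 unknowns per axis, so at least 6 calibration points are needed; the classic 9-point grid adds redundancy against noise. A self-contained sketch under those assumptions:

```python
def _features(px, py):
    # second-order polynomial basis: 6 terms per screen axis
    return [1.0, px, py, px * py, px * px, py * py]

def _solve(A, b):
    """Gaussian elimination with partial pivoting for a small square system."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x

def fit_gaze_map(pupil_pts, screen_pts):
    """Least-squares fit of two 6-term polynomials (one per screen axis).
    Needs at least 6 calibration points; 9 or more improves noise robustness."""
    X = [_features(px, py) for px, py in pupil_pts]
    coeffs = []
    for axis in (0, 1):
        # normal equations: (X^T X) w = X^T y
        XtX = [[sum(X[k][i] * X[k][j] for k in range(len(X))) for j in range(6)]
               for i in range(6)]
        Xty = [sum(X[k][i] * screen_pts[k][axis] for k in range(len(X)))
               for i in range(6)]
        coeffs.append(_solve(XtX, Xty))
    return coeffs

def predict_gaze(coeffs, px, py):
    """Map a pupil-image position to an estimated on-screen gaze point."""
    f = _features(px, py)
    return tuple(sum(w * t for w, t in zip(ws, f)) for ws in coeffs)
```

With more calibration points than unknowns, the least-squares fit averages out per-point measurement noise, which is exactly the trade-off the abstract describes between calibration burden and gaze accuracy.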
Coordination of component mental operations in a multiple-response task
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968380
Shu-Chieh Wu, R. Remington
{"title":"Coordination of component mental operations in a multiple-response task","authors":"Shu-Chieh Wu, R. Remington","doi":"10.1145/968363.968380","DOIUrl":"https://doi.org/10.1145/968363.968380","url":null,"abstract":"Models of human performance typically focus on the mental components of task processing from discrete task trials. This approach neglects the advance planning of actions and overlapping of tasks characteristic of natural settings. The present research measures the joint timing of eye movements and manual responses in a typing-like task with the goal of extending models of discrete task performance to continuous domains. Following Pashler [1994] participants made separate choice responses to a series of five letters spread over a wide viewing area. Replicating Pashler's results, significant preview effects were found in both response time and eye movement data. Response to the first stimulus was delayed, with inter-response intervals for subsequent items rapid and flat across items. The eyes moved toward the next letter about 800 ms before the corresponding manual response (eye-hand span). Fixation dwell time was affected by stimulus luminance as well as difficulty of response mapping. The results suggest that fixation duration entails more than perceptual analyses. Implications of the results are discussed.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126529245","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13
The determinants of web page viewing behavior: an eye-tracking study
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968391
Bing Pan, H. Hembrooke, Geri Gay, Laura A. Granka, Matthew K. Feusner, Jill K. Newman
{"title":"The determinants of web page viewing behavior: an eye-tracking study","authors":"Bing Pan, H. Hembrooke, Geri Gay, Laura A. Granka, Matthew K. Feusner, Jill K. Newman","doi":"10.1145/968363.968391","DOIUrl":"https://doi.org/10.1145/968363.968391","url":null,"abstract":"The World Wide Web has become a ubiquitous information source and communication channel. With such an extensive user population, it is imperative to understand how web users view different web pages. Based on an eye tracking study of 30 subjects on 22 web pages from 11 popular web sites, this research intends to explore the determinants of ocular behavior on a single web page: whether it is determined by individual differences of the subjects, different types of web sites, the order of web pages being viewed, or the task at hand. The results indicate that gender of subjects, the viewing order of a web page, and the interaction between page order and site type influences online ocular behavior. Task instruction did not significantly affect web viewing behavior. Scanpath analysis revealed that the complexity of web page design influences the degree of scanpath variation among different subjects on the same web page. The contributions and limitations of this research, and future research directions are discussed.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"234 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124572422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 381
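Scanpath variation of the kind this study analyzes is commonly quantified with the string-edit method: each fixation is encoded as a letter for the area of interest (AOI) it lands in, and two scanpaths are compared by Levenshtein edit distance, often normalized by the longer string's length. The sketch below shows that standard technique; it is not necessarily the exact procedure used in the paper.

```python
def levenshtein(a, b):
    """Minimum number of insertions, deletions and substitutions turning
    AOI string a into AOI string b (classic dynamic programming, two rows)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(prev[j] + 1,                 # deletion
                            curr[j - 1] + 1,             # insertion
                            prev[j - 1] + (ca != cb)))   # substitution
        prev = curr
    return prev[-1]

def scanpath_dissimilarity(a, b):
    """Edit distance normalized to [0, 1] by the longer scanpath;
    0 means identical viewing sequences, 1 means maximally different."""
    if not a and not b:
        return 0.0
    return levenshtein(a, b) / max(len(a), len(b))
```

Averaging this dissimilarity over all subject pairs on one page gives a single per-page score, so pages can be ranked by how much their design constrains, or fails to constrain, viewing order.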
A gaze contingent environment for fostering social attention in autistic children
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968367
R. Ramloll, C. Trepagnier, M. Sebrechts, A. Finkelmeyer
{"title":"A gaze contingent environment for fostering social attention in autistic children","authors":"R. Ramloll, C. Trepagnier, M. Sebrechts, A. Finkelmeyer","doi":"10.1145/968363.968367","DOIUrl":"https://doi.org/10.1145/968363.968367","url":null,"abstract":"This paper documents the engineering of a gaze contingent therapeutic environment for the exploration and validation of a proposed rehabilitative technique addressing attention deficits in 24 to 54 months old autistic subjects. It discusses the current state of progress and lessons learnt so far while highlighting the outstanding engineering challenges of this project. We focus on calibration issues for this target group of users, explain the architecture of the system and present our general workflow for the construction of the gaze contingent environment. While this work is being undertaken for therapeutic purposes, it is likely to be relevant to the construction of gaze contingent displays for entertainment.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115306599","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 31
Focus of attention and pilot error
Eye Tracking Research & Application Pub Date : 2004-03-22 DOI: 10.1145/968363.968377
E. Hanson
{"title":"Focus of attention and pilot error","authors":"E. Hanson","doi":"10.1145/968363.968377","DOIUrl":"https://doi.org/10.1145/968363.968377","url":null,"abstract":"The evolution of cockpit automation is associated with an increased criticality of human error because missing, ignoring, or incorrectly processing even the smallest bit of relevant information can lead to an aircraft incident or accident occurrence. The most important factors associated with such occurrences are focus of attention and pilot error. Research performed at the National Aerospace Laboratory (NLR) has shown that changes in focus of attention can be measured via an eye tracking system (ASL 4000SU). The aim of this paper is to discuss how eye movements are used to indicate focus of attention, and how such information can be used to design new cockpit displays with decreased chances of pilot error.","PeriodicalId":127538,"journal":{"name":"Eye Tracking Research & Application","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2004-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128295669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 15