Eye Tracking Research & Application — Latest Publications

Auramirror: reflections on attention
Alexander W. Skaburskis, Roel Vertegaal, Jeffrey S. Shell
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968385
Abstract: As ubiquitous computing becomes more prevalent, greater consideration will have to be given to how devices interrupt us and vie for our attention. This paper describes Auramirror, an interactive art piece that raises questions about how computers use our attention. By measuring attention and visualizing the results for the audience in real time, Auramirror brings the subject matter to the forefront of the audience's consideration. Finally, some ways of using the Auramirror system to help in the design of attention-sensitive devices are discussed.
Citations: 7
Frequency analysis of task evoked pupillary response and eye-movement
M. Nakayama, Y. Shimizu
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968381
Abstract: This paper examines the influence of eye blinks on frequency analysis, and the power-spectrum differences in task-evoked pupillography and eye movement, during an experiment consisting of ocular following tasks and oral calculation tasks at three levels of difficulty: control, 1×1, and 1×2 digit oral calculation. A compensation model for temporal pupil size based on a multilayer perceptron (MLP) was trained to detect blinks and to estimate pupil size from blink-free pupillary change and artificial blink patterns. Power spectral density (PSD) measurements from the estimated pupillography during oral calculation tasks show significant differences, and the PSD increased with task difficulty in the 0.1-0.5 Hz and 1.6-3.5 Hz bands, as did average pupil size. Eye movement during blinks was corrected manually to remove irregular eye movements such as saccades. The cross spectral density (CSD) was computed from the horizontal and vertical eye-movement coordinates, and significant differences in CSD among experimental conditions were found in the 0.6-1.5 Hz band. These differences suggest that task difficulty affects the relationship between horizontal and vertical eye-movement coordinates in the frequency domain.
Citations: 50
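The spectral measures named in the abstract (PSD of pupil size, CSD between horizontal and vertical gaze coordinates) correspond to standard signal-processing routines. Below is a minimal sketch of that kind of band-power analysis using SciPy; the 60 Hz sampling rate, segment length, and variable names are assumptions for illustration, not details taken from the paper.

```python
import numpy as np
from scipy.signal import welch, csd

FS = 60.0  # assumed sampling rate in Hz (not stated in the abstract)

def band_power(freqs, density, lo, hi):
    """Integrate a one-sided spectral density over the band [lo, hi] Hz."""
    mask = (freqs >= lo) & (freqs <= hi)
    return np.trapz(density[mask], freqs[mask])

def spectral_features(pupil, gaze_x, gaze_y, fs=FS):
    """pupil: blink-compensated pupil-size trace; gaze_x/gaze_y: eye-position traces."""
    f_p, psd_pupil = welch(pupil, fs=fs, nperseg=512)
    f_c, csd_xy = csd(gaze_x, gaze_y, fs=fs, nperseg=512)
    return {
        "pupil_power_0.1-0.5Hz": band_power(f_p, psd_pupil, 0.1, 0.5),
        "pupil_power_1.6-3.5Hz": band_power(f_p, psd_pupil, 1.6, 3.5),
        "gaze_csd_power_0.6-1.5Hz": band_power(f_c, np.abs(csd_xy), 0.6, 1.5),
    }
```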
Eye tracking system model with easy calibration
A. Villanueva, R. Cabeza, Sonia Porta
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968372
Abstract: Calibration is one of the most tedious and often annoying aspects of many eye tracking systems. It normally consists of looking at several marks on a screen in order to collect enough data to adjust the parameters of a model. Unfortunately, this step is unavoidable if a competent tracking system is desired. Many efforts have been made to build more capable eye tracking systems, yet the search for an accurate mathematical model remains one of the least researched areas. The lack of a parametric description of the gaze estimation problem makes it difficult to find the most suitable model, so generic expressions are employed instead during calibration and tracking sessions. A model based on parameters describing the elements involved in the tracking system would provide a stronger basis and greater robustness. The aim of this work is to build a mathematical model based entirely on realistic variables describing the elements of an eye tracking system that uses the well-known bright-pupil technique: user, camera, illumination, and screen. The model is defined once an expression is found relating the point the user is looking at to the features extracted from the image (glint position and pupil center). The desired model should be simple, realistic, accurate, and easy to calibrate.
Citations: 29
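The paper argues for a fully parametric geometric model rather than the generic regression mappings commonly used for calibration. For context, the sketch below shows such a generic baseline: a second-order polynomial fitted by least squares from the pupil-glint vector to screen coordinates. This is explicitly not the authors' model; the feature choice and polynomial terms are assumptions.

```python
import numpy as np

def fit_poly_mapping(pupil_centers, glints, screen_points):
    """Fit a second-order polynomial from the pupil-glint vector to screen
    coordinates by least squares. Generic regression baseline only, NOT the
    parametric geometric model proposed in the paper."""
    v = np.asarray(pupil_centers, float) - np.asarray(glints, float)   # (N, 2)
    x, y = v[:, 0], v[:, 1]
    A = np.column_stack([np.ones_like(x), x, y, x * y, x**2, y**2])    # (N, 6)
    coeffs, *_ = np.linalg.lstsq(A, np.asarray(screen_points, float), rcond=None)
    return coeffs  # shape (6, 2): one column per screen axis

def predict_gaze(coeffs, pupil_center, glint):
    """Map one pupil-glint vector to an estimated screen point."""
    vx, vy = np.asarray(pupil_center, float) - np.asarray(glint, float)
    a = np.array([1.0, vx, vy, vx * vy, vx**2, vy**2])
    return a @ coeffs
```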
Eye movements as reflections of perceptual and cognitive processes (abstract only)
K. Rayner
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968365
Abstract: Some historical issues regarding the use of eye movements to study cognitive processes will initially be discussed. The development of eye-contingent display-change experiments will be reviewed, and examples will be presented of how the technique provided answers to interesting questions. For the most part, the examples will be taken from the psychology of reading, but other tasks will also be discussed. More recently, sophisticated models of eye movement control in the context of reading have been developed, and these models will be discussed. Some thoughts on future directions of eye movement research will also be presented.
Citations: 3
Robust clustering of eye movement recordings for quantification of visual interest
A. Santella, D. DeCarlo
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968368
Abstract: Characterizing the location and extent of a viewer's interest, in terms of eye movement recordings, informs a range of investigations in image and scene viewing. We present an automatic, data-driven method that clusters visual point-of-regard (POR) measurements into gazes and regions of interest using the mean shift procedure. Clusters produced with this method form a structured representation of viewer interest, and at the same time are replicable and not heavily influenced by noise or outliers. They are therefore useful for answering fine-grained questions about where and how a viewer examined an image.
Citations: 191
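The abstract names the mean shift procedure as the clustering engine. A minimal sketch of mean shift clustering of point-of-regard samples, using scikit-learn's MeanShift, is shown below; the pixel bandwidth is an assumed value, and the paper additionally clusters over time, which this sketch omits.

```python
import numpy as np
from sklearn.cluster import MeanShift

def cluster_por(points, spatial_bandwidth=40.0):
    """Cluster point-of-regard samples into regions of interest with mean shift.
    points: array of shape (N, 2) holding (x, y) POR samples in pixels.
    The bandwidth (in pixels) is an assumed value for illustration."""
    points = np.asarray(points, float)
    ms = MeanShift(bandwidth=spatial_bandwidth)
    labels = ms.fit_predict(points)       # cluster index per POR sample
    centers = ms.cluster_centers_         # one (x, y) center per cluster
    return labels, centers
```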
A free-head, simple calibration, gaze tracking system that enables gaze-based interaction
Takehiko Ohno, N. Mukawa
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968387
Abstract: Human eye gaze is a strong candidate for creating a new application area based on human-computer interaction. To implement a truly practical gaze-based interaction system, gaze detection must be realized without placing any restriction on the user's behavior or comfort. This paper describes a gaze tracking system that offers free head movement and simple personal calibration. It does not require the user to wear anything on her head, and she can move her head freely. Personal calibration takes only a very short time: the user is asked to look at two markers on the screen. An experiment shows that the accuracy of the implemented system is about 1.0 degree of visual angle.
Citations: 184
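To make the "look at two markers" calibration concrete, the sketch below shows the simplest possible two-point personal calibration: a per-axis gain and offset estimated from the two fixations. This is an illustration of the general idea only, not necessarily the correction the described system applies.

```python
import numpy as np

def two_point_calibration(measured, targets):
    """Estimate per-axis gain and offset from two calibration fixations.
    measured: raw gaze estimates for the two markers, shape (2, 2)
    targets:  true on-screen marker positions, shape (2, 2)
    Generic illustration of two-point personal calibration; assumed, not
    the specific correction used by the paper's system."""
    measured = np.asarray(measured, float)
    targets = np.asarray(targets, float)
    gain = (targets[1] - targets[0]) / (measured[1] - measured[0])
    offset = targets[0] - gain * measured[0]
    return gain, offset  # calibrated gaze = gain * raw + offset
```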
An eye for an eye: a performance evaluation comparison of the LC Technologies and Tobii eye trackers
D. Cheng, Roel Vertegaal
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968378
Citations: 21
Gaze typing compared with input by head and hand
J. P. Hansen, K. Tørning, A. Johansen, K. Itoh, Hirotaka Aoki
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968389
Abstract: This paper investigates the usability of gaze-typing systems for disabled people from a broad perspective that takes into account the usage scenarios and the particular users these systems benefit. Design goals for a gaze-typing system are identified: productivity above 25 words per minute, robust tracking, high availability, and support for multimodal input. A detailed investigation of efficiency and user satisfaction with a Danish and a Japanese gaze-typing system compares them with head- and mouse (hand) typing. We found gaze typing to be more error-prone than the other two modalities. Gaze typing was just as fast as head typing, and both were slower than mouse (hand) typing. Possibilities for design improvements are discussed.
Citations: 143
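The 25 words-per-minute design goal follows the standard text-entry convention that one "word" equals five characters, including spaces. A tiny helper for that metric, assuming a transcribed string and an elapsed time, is sketched below.

```python
def words_per_minute(transcribed_text: str, seconds_elapsed: float) -> float:
    """Standard text-entry throughput: one word = five characters, spaces
    included. The paper's 25 wpm goal corresponds to 125 characters/minute."""
    words = len(transcribed_text) / 5.0
    return words / (seconds_elapsed / 60.0)
```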
Building a lightweight eyetracking headgear
J. Babcock, J. Pelz
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968386
Abstract: Eyetracking systems that use video-based cameras to monitor the eye and the scene can be made significantly smaller thanks to tiny micro-lens video cameras. Pupil detection algorithms are generally implemented in hardware, allowing for real-time eyetracking; however, it is likely that real-time eyetracking will soon be accomplished fully in software. This paper encourages an "open-source" approach to eyetracking by providing practical tips on building a lightweight eyetracker from commercially available micro-lens cameras and other parts. While the headgear described here can be used with any dark-pupil eyetracking controller, it also opens the door to open-source software solutions that could be developed by the eyetracking and image-processing communities. Such systems could be optimized without concern for real-time performance, because they could be run offline.
Citations: 166
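For readers following the paper's suggestion of offline, software-only processing, the sketch below shows a minimal dark-pupil detector built on OpenCV: threshold the eye image, keep the largest dark blob, and return its centroid. The threshold and area values are assumptions, not figures from the paper.

```python
import cv2

def find_dark_pupil(eye_gray, thresh=40, min_area=200):
    """Locate the pupil in a grayscale eye image as the largest dark blob.
    thresh and min_area are assumed starting values; a real offline pipeline
    of the kind the paper encourages would tune them per camera and lighting."""
    _, binary = cv2.threshold(eye_gray, thresh, 255, cv2.THRESH_BINARY_INV)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    contours = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # pupil centroid (x, y)
```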
Eye gaze patterns differentiate novice and experts in a virtual laparoscopic surgery training environment
Benjamin Law, S. Fraser, Stella Atkins, A. Kirkpatrick, A. Lomax, C. MacKenzie
Eye Tracking Research & Application. Pub Date: 2004-03-22. DOI: 10.1145/968363.968370
Abstract: Visual information is important to surgeons' manipulative performance, especially in laparoscopic surgery, where tactile feedback is reduced compared with open surgery. Studying surgeons' eye movements is an innovative way of assessing skill: a comparison of the eye movement strategies of expert surgeons and novices may reveal important differences that could be used in training. We conducted a preliminary study comparing the eye movements of 5 experts and 5 novices performing a one-handed aiming task on a computer-based laparoscopic surgery simulator. The performance results showed that experts were quicker and generally committed fewer errors than novices. We investigated eye movements as a possible factor in the experts' better performance. Eye gaze analysis showed that novices needed more visual feedback on the tool position to complete the task than experts did. In addition, the experts tended to maintain their gaze on the target while manipulating the tool, whereas novices were more varied in their behaviour; for example, on some trials novices tracked the movement of the tool until it reached the target.
Citations: 251
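One simple way to quantify the "experts keep their eyes on the target" observation is the fraction of gaze samples falling within a fixed radius of the target. The sketch below is a hypothetical measure along those lines, not the analysis the authors performed; the radius and pixel units are assumptions.

```python
import numpy as np

def target_dwell_fraction(gaze_xy, target_xy, radius=50.0):
    """Fraction of gaze samples within `radius` pixels of the target position.
    gaze_xy: array of shape (N, 2) of gaze samples; target_xy: (x, y) target.
    Radius and units are illustrative assumptions."""
    gaze_xy = np.asarray(gaze_xy, float)
    d = np.linalg.norm(gaze_xy - np.asarray(target_xy, float), axis=1)
    return float(np.mean(d <= radius))
```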