Proceedings. Eye Tracking Research & Applications Symposium: Latest Publications

Skill Characterisation of Sonographer Gaze Patterns during Second Trimester Clinical Fetal Ultrasounds using Time Curves.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2022-06-01 DOI: 10.1145/3517031.3529637
Clare Teng, Lok Hin Lee, Jayne Lander, Lior Drukker, Aris T Papageorghiou, Alison J Noble
{"title":"Skill Characterisation of Sonographer Gaze Patterns during Second Trimester Clinical Fetal Ultrasounds using Time Curves.","authors":"Clare Teng,&nbsp;Lok Hin Lee,&nbsp;Jayne Lander,&nbsp;Lior Drukker,&nbsp;Aris T Papageorghiou,&nbsp;Alison J Noble","doi":"10.1145/3517031.3529637","DOIUrl":"https://doi.org/10.1145/3517031.3529637","url":null,"abstract":"<p><p>We present a method for skill characterisation of sonographer gaze patterns while performing routine second trimester fetal anatomy ultrasound scans. The position and scale of fetal anatomical planes during each scan differ because of fetal position, movements and sonographer skill. A standardised reference is required to compare recorded eye-tracking data for skill characterisation. We propose using an affine transformer network to localise the anatomy circumference in video frames, for normalisation of eye-tracking data. We use an event-based data visualisation, time curves, to characterise sonographer scanning patterns. We chose brain and heart anatomical planes because they vary in levels of gaze complexity. Our results show that when sonographers search for the same anatomical plane, even though the landmarks visited are similar, their time curves display different visual patterns. Brain planes also, on average, have more events or landmarks occurring than the heart, which highlights anatomy-specific differences in searching approaches.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2022 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7614191/pdf/EMS159394.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9930156","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Visualising Spatio-Temporal Gaze Characteristics for Exploratory Data Analysis in Clinical Fetal Ultrasound Scans.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2022-06-01 Epub Date: 2022-06-08 DOI: 10.1145/3517031.3529635
Clare Teng, Harshita Sharma, Lior Drukker, Aris T Papageorghiou, Alison J Noble
{"title":"Visualising Spatio-Temporal Gaze Characteristics for Exploratory Data Analysis in Clinical Fetal Ultrasound Scans.","authors":"Clare Teng, Harshita Sharma, Lior Drukker, Aris T Papageorghiou, Alison J Noble","doi":"10.1145/3517031.3529635","DOIUrl":"10.1145/3517031.3529635","url":null,"abstract":"<p><p>Visualising patterns in clinicians' eye movements while interpreting fetal ultrasound imaging videos is challenging. Across and within videos, there are differences in size an d position of Areas-of-Interest (AOIs) due to fetal position, movement and sonographer skill. Currently, AOIs are manually labelled or identified using eye-tracker manufacturer specifications which are not study specific. We propose using unsupervised clustering to identify meaningful AOIs and bi-contour plots to visualise spatio-temporal gaze characteristics. We use Hierarchical Density-Based Spatial Clustering of Applications with Noise (HDBSCAN) to identify the AOIs, and use their corresponding images to capture granular changes within each AOI. Then we visualise transitions within and between AOIs as read by the sonographer. We compare our method to a standardised eye-tracking manufacturer algorithm. Our method captures granular changes in gaze characteristics which are otherwise not shown. Our method is suitable for exploratory data analysis of eye-tracking data involving multiple participants and AOIs.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2022 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7614061/pdf/EMS159392.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9558055","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Eye Tracking: Background, Methods, and Applications
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2022-01-01 DOI: 10.1007/978-1-0716-2391-6
{"title":"Eye Tracking: Background, Methods, and Applications","authors":"","doi":"10.1007/978-1-0716-2391-6","DOIUrl":"https://doi.org/10.1007/978-1-0716-2391-6","url":null,"abstract":"","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"23 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"84104055","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2021-05-01 DOI: 10.1145/3450341.3458881
Haylie L Miller, Ian R Zurutuza, Nicholas E Fears, Suleyman O Polat, Rodney D Nielsen
{"title":"Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration.","authors":"Haylie L Miller, Ian R Zurutuza, Nicholas E Fears, Suleyman O Polat, Rodney D Nielsen","doi":"10.1145/3450341.3458881","DOIUrl":"10.1145/3450341.3458881","url":null,"abstract":"<p><p>Mobile eye-tracking and motion-capture techniques yield rich, precisely quantifiable data that can inform our understanding of the relationship between visual and motor processes during task performance. However, these systems are rarely used in combination, in part because of the significant time and human resources required for post-processing and analysis. Recent advances in computer vision have opened the door for more efficient processing and analysis solutions. We developed a post-processing pipeline to integrate mobile eye-tracking and full-body motion-capture data. These systems were used simultaneously to measure visuomotor integration in an immersive virtual environment. Our approach enables calculation of a 3D gaze vector that can be mapped to the participant's body position and objects in the virtual environment using a uniform coordinate system. This approach is generalizable to other configurations, and enables more efficient analysis of eye, head, and body movements together during visuomotor tasks administered in controlled, repeatable environments.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":" ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC8276594/pdf/nihms-1718937.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"39185504","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Fixational stability as a measure for the recovery of visual function in amblyopia.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2021-05-01 DOI: 10.1145/3450341.3458493
Avi M Aizenman, Dennis M Levi
{"title":"Fixational stability as a measure for the recovery of visual function in amblyopia.","authors":"Avi M Aizenman,&nbsp;Dennis M Levi","doi":"10.1145/3450341.3458493","DOIUrl":"https://doi.org/10.1145/3450341.3458493","url":null,"abstract":"<p><p>People with amblyopia have been shown to have decreased fixational stability, particularly those with strabismic amblyopia. Fixational stability and visual acuity have been shown to be tightly correlated across multiple studies, suggesting a relationship between acuity and oculomotor stability. Reduced visual acuity is the sine qua non of amblyopia, and recovery is measured by the improvement in visual acuity. Here we ask whether fixational stability can be used as an objective marker for the recovery of visual function in amblyopia. We tracked children's fixational stability during patching treatment over time and found fixational stability changes alongside improvements in visual acuity. This suggests fixational stability can be used as an objective measure for monitoring treatment in amblyopia and other disorders.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2021 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1145/3450341.3458493","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"9136268","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
Positional head-eye tracking outside the lab: an open-source solution.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2020-06-01 DOI: 10.1145/3379156.3391365
Peter Hausamann, Christian Sinnott, Paul R MacNeilage
{"title":"Positional head-eye tracking outside the lab: an open-source solution.","authors":"Peter Hausamann,&nbsp;Christian Sinnott,&nbsp;Paul R MacNeilage","doi":"10.1145/3379156.3391365","DOIUrl":"https://doi.org/10.1145/3379156.3391365","url":null,"abstract":"<p><p>Simultaneous head and eye tracking has traditionally been confined to a laboratory setting and real-world motion tracking limited to measuring linear acceleration and angular velocity. Recently available mobile devices such as the Pupil Core eye tracker and the Intel RealSense T265 motion tracker promise to deliver accurate measurements outside the lab. Here, the researchers propose a hard- and software framework that combines both devices into a robust, usable, low-cost head and eye tracking system. The developed software is open source and the required hardware modifications can be 3D printed. The researchers demonstrate the system's ability to measure head and eye movements in two tasks: an eyes-fixed head rotation task eliciting the vestibulo-ocular reflex inside the laboratory, and a natural locomotion task where a subject walks around a building outside of the laboratory. The resultant head and eye movements are discussed, as well as future implementations of this system.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2020 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1145/3379156.3391365","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25530532","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
GazeMetrics: An Open-Source Tool for Measuring the Data Quality of HMD-based Eye Trackers.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2020-06-01 DOI: 10.1145/3379156.3391374
Isayas B Adhanom, Samantha C Lee, Eelke Folmer, Paul MacNeilage
{"title":"GazeMetrics: An Open-Source Tool for Measuring the Data Quality of HMD-based Eye Trackers.","authors":"Isayas B Adhanom,&nbsp;Samantha C Lee,&nbsp;Eelke Folmer,&nbsp;Paul MacNeilage","doi":"10.1145/3379156.3391374","DOIUrl":"https://doi.org/10.1145/3379156.3391374","url":null,"abstract":"As virtual reality (VR) garners more attention for eye tracking research, knowledge of accuracy and precision of head-mounted display (HMD) based eye trackers becomes increasingly necessary. It is tempting to rely on manufacturer-provided information about the accuracy and precision of an eye tracker. However, unless data is collected under ideal conditions, these values seldom align with on-site metrics. Therefore, best practices dictate that accuracy and precision should be measured and reported for each study. To address this issue, we provide a novel open-source suite for rigorously measuring accuracy and precision for use with a variety of HMD-based eye trackers. This tool is customizable without having to alter the source code, but changes to the code allow for further alteration. The outputs are available in real time and easy to interpret, making eye tracking with VR more approachable for all users.","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2020 ","pages":""},"PeriodicalIF":0.0,"publicationDate":"2020-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1145/3379156.3391374","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"25537137","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 15
CIDER: Enhancing the Performance of Computational Eyeglasses.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2016-03-01 DOI: 10.1145/2857491.2884063
Addison Mayberry, Yamin Tun, Pan Hu, Duncan Smith-Freedman, Benjamin Marlin, Christopher Salthouse, Deepak Ganesan
{"title":"CIDER: Enhancing the Performance of Computational Eyeglasses.","authors":"Addison Mayberry,&nbsp;Yamin Tun,&nbsp;Pan Hu,&nbsp;Duncan Smith-Freedman,&nbsp;Benjamin Marlin,&nbsp;Christopher Salthouse,&nbsp;Deepak Ganesan","doi":"10.1145/2857491.2884063","DOIUrl":"https://doi.org/10.1145/2857491.2884063","url":null,"abstract":"<p><p>The human eye offers a fascinating window into an individual's health, cognitive attention, and decision making, but we lack the ability to continually measure these parameters in the natural environment. We demonstrate CIDER, a system that operates in a highly optimized low-power mode under indoor settings by using a fast Search-Refine controller to track the eye, but detects when the environment switches to more challenging outdoor sunlight and switches models to operate robustly under this condition. Our design is holistic and tackles a) power consumption in digitizing pixels, estimating pupillary parameters, and illuminating the eye via near-infrared and b) error in estimating pupil center and pupil dilation. We demonstrate that CIDER can estimate pupil center with error less than two pixels (0.6°), and pupil diameter with error of one pixel (0.22mm). Our end-to-end results show that we can operate at power levels of roughly 7mW at a 4Hz eye tracking rate, or roughly 32mW at rates upwards of 250Hz.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2016 ","pages":"313-314"},"PeriodicalIF":0.0,"publicationDate":"2016-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1145/2857491.2884063","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"35986484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4
On Relationships Between Fixation Identification Algorithms and Fractal Box Counting Methods.
Proceedings. Eye Tracking Research & Applications Symposium Pub Date : 2014-03-01 DOI: 10.1145/2578153.2578161
Quan Wang, Elizabeth Kim, Katarzyna Chawarska, Brian Scassellati, Steven Zucker, Frederick Shic
{"title":"On Relationships Between Fixation Identification Algorithms and Fractal Box Counting Methods.","authors":"Quan Wang,&nbsp;Elizabeth Kim,&nbsp;Katarzyna Chawarska,&nbsp;Brian Scassellati,&nbsp;Steven Zucker,&nbsp;Frederick Shic","doi":"10.1145/2578153.2578161","DOIUrl":"https://doi.org/10.1145/2578153.2578161","url":null,"abstract":"<p><p>Fixation identification algorithms facilitate data comprehension and provide analytical convenience in eye-tracking analysis. However, current fixation algorithms for eye-tracking analysis are heavily dependent on parameter choices, leading to instabilities in results and incompleteness in reporting. This work examines the nature of human scanning patterns during complex scene viewing. We show that standard implementations of the commonly used distance-dispersion algorithm for fixation identification are functionally equivalent to greedy spatiotemporal tiling. We show that modeling the number of fixations as a function of tiling size leads to a measure of fractal dimensionality through box counting. We apply this technique to examine scale-free gaze behaviors in toddlers and adults looking at images of faces and blocks, as well as large number of adults looking at movies or static images. The distributional aspects of the number of fixations may suggest a fractal structure to gaze patterns in free scanning and imply that the incompleteness of standard algorithms may be due to the scale-free behaviors of the underlying scanning distributions. We discuss the nature of this hypothesis, its limitations, and offer directions for future work.</p>","PeriodicalId":74558,"journal":{"name":"Proceedings. Eye Tracking Research & Applications Symposium","volume":"2014 ","pages":"67-74"},"PeriodicalIF":0.0,"publicationDate":"2014-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://sci-hub-pdf.com/10.1145/2578153.2578161","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"34120805","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 4