Towards automated comparison of eye-tracking recordings in dynamic scenes

Thomas C. Kübler, Dennis R. Bukenberger, Judith Ungewiss, A. Worner, Colleen Rothe, U. Schiefer, W. Rosenstiel, Enkelejda Kasneci
DOI: 10.1109/EUVIP.2014.7018371
Published in: 2014 5th European Workshop on Visual Information Processing (EUVIP)
Publication date: 2014-12-01
Citations: 12

Abstract

Experiments involving eye tracking usually require the analysis of large amounts of data. While there is a rich landscape of tools to extract information about fixations and saccades from such data, analysis at a higher level of abstraction (e.g., comparison of visual scanpaths between subjects) is still performed manually. In particular, the comparison of scanpaths derived from dynamic scenarios, where the observer is in constant interaction with the environment, is highly challenging. In this work we (i) introduce a new workflow for automated scanpath comparison in dynamic environments, which combines image processing, object tracking, and sequence comparison algorithms, and (ii) provide a new data set for performance evaluation of scanpath comparison methods, extracted from eye-tracking data recorded during an interactive tea-making task modeled on the experiments by Land et al. [1]. Furthermore, to showcase the applicability of our workflow, we applied our method to the above data set to find differences in visual behavior between several runs of the tea-making task.
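The sequence-comparison step mentioned in the abstract can be illustrated with a common baseline (not the authors' specific algorithm): each scanpath is encoded as a sequence of area-of-interest (AOI) labels, and two scanpaths are compared via edit distance. The AOI names below are hypothetical, chosen to evoke the tea-making task.

```python
def edit_distance(a, b):
    """Classic dynamic-programming Levenshtein distance between two sequences."""
    m, n = len(a), len(b)
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        dp[i][0] = i  # deletions only
    for j in range(n + 1):
        dp[0][j] = j  # insertions only
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            dp[i][j] = min(dp[i - 1][j] + 1,        # delete a[i-1]
                           dp[i][j - 1] + 1,        # insert b[j-1]
                           dp[i - 1][j - 1] + cost) # match/substitute
    return dp[m][n]

def scanpath_similarity(p, q):
    """Normalized similarity in [0, 1] between two AOI-label sequences."""
    if not p and not q:
        return 1.0
    return 1.0 - edit_distance(p, q) / max(len(p), len(q))

# Hypothetical AOI sequences from two runs of a tea-making task
run1 = ["kettle", "tap", "kettle", "cup", "teabag"]
run2 = ["kettle", "cup", "tap", "kettle", "teabag"]
print(round(scanpath_similarity(run1, run2), 2))  # → 0.6
```

Note that plain edit distance ignores fixation durations and object motion; handling the latter is precisely why the paper combines sequence comparison with object tracking in dynamic scenes.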