Towards automated comparison of eye-tracking recordings in dynamic scenes

Thomas C. Kübler, Dennis R. Bukenberger, Judith Ungewiss, A. Worner, Colleen Rothe, U. Schiefer, W. Rosenstiel, Enkelejda Kasneci

2014 5th European Workshop on Visual Information Processing (EUVIP), December 2014. DOI: 10.1109/EUVIP.2014.7018371
Experiments involving eye tracking usually require the analysis of large amounts of data. While there is a rich landscape of tools for extracting information about fixations and saccades from such data, analysis at a higher level of abstraction (e.g., the comparison of visual scanpaths between subjects) is still performed manually. In particular, comparing scanpaths derived from dynamic scenarios, where the observer is in constant interaction with her environment, is highly challenging. In this work we (i) introduce a new workflow for automated scanpath comparison in dynamic environments that combines image processing, object tracking, and sequence comparison algorithms, and (ii) provide a new data set for the performance evaluation of scanpath comparison methods, extracted from eye-tracking data recorded during an interactive tea-cooking task modeled on the experiments by Land et al. [1]. Furthermore, to showcase the applicability of our workflow, we applied our method to this data set to find differences in visual behavior between several runs of the tea-cooking task.
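The abstract does not detail the authors' pipeline, but the two late stages it names (mapping fixations onto tracked objects, then comparing the resulting sequences) can be illustrated with a standard technique from scanpath analysis: edit-distance alignment of object-label sequences. The sketch below is a minimal Python illustration under that assumption only; the bounding-box representation, all function names, and the example data are hypothetical and not taken from the paper.

```python
# Minimal sketch, NOT the authors' implementation:
# (1) map each fixation to the label of the tracked object it lands on,
# (2) compare the resulting label sequences via Levenshtein alignment.
# All names, boxes, and sequences below are hypothetical example data.

def label_fixation(x, y, boxes, default="background"):
    """Return the label of the first object bounding box containing (x, y).

    `boxes` maps an object label to its (x_min, y_min, x_max, y_max) box
    in the current video frame, as produced by some object tracker.
    """
    for label, (x0, y0, x1, y1) in boxes.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return label
    return default

def levenshtein(a, b):
    """Edit distance between two label sequences (dynamic programming)."""
    prev = list(range(len(b) + 1))  # distances for the empty prefix of a
    for i, ca in enumerate(a, start=1):
        curr = [i]
        for j, cb in enumerate(b, start=1):
            cost = 0 if ca == cb else 1
            curr.append(min(prev[j] + 1,          # delete from a
                            curr[j - 1] + 1,      # insert into a
                            prev[j - 1] + cost))  # substitute
        prev = curr
    return prev[-1]

def scanpath_similarity(p, q):
    """Normalized similarity in [0, 1]; 1.0 means identical sequences."""
    if not p and not q:
        return 1.0
    return 1.0 - levenshtein(p, q) / max(len(p), len(q))

# Hypothetical tracker output for one frame, and one fixation lookup
boxes = {"kettle": (100, 80, 220, 200), "cup": (300, 150, 360, 210)}
print(label_fixation(160, 120, boxes))  # -> "kettle"

# Hypothetical fixation-label sequences from two runs of a tea-cooking task
run_1 = ["kettle", "tap", "kettle", "cup", "teabag", "kettle"]
run_2 = ["kettle", "cup", "tap", "kettle", "teabag"]
print(scanpath_similarity(run_1, run_2))  # -> 0.5
```

Normalizing by the longer sequence keeps the score comparable across runs of different length; note that this plain edit distance is order-sensitive and treats all object substitutions as equally costly, whereas weighted alignments (e.g., Needleman-Wunsch with a substitution matrix) can encode how semantically close two objects are.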