Conditioning Gaze-Contingent Systems for the Real World: Insights from a Field Study in the Fast Food Industry

Melanie Heck, Janick Edinger, Christian Becker
{"title":"现实世界的条件反射注视-偶然系统:来自快餐行业实地研究的见解","authors":"Melanie Heck, Janick Edinger, Christian Becker","doi":"10.1145/3411763.3451658","DOIUrl":null,"url":null,"abstract":"Eye tracking can be used to infer what is relevant to a user, and adapt the content and appearance of an application to support the user in their current task. A prerequisite for integrating such adaptive user interfaces into public terminals is robust gaze estimation. Commercial eye trackers are highly accurate, but require prior person-specific calibration and a relatively stable head position. In this paper, we collect data from 26 authentic customers of a fast food restaurant while interacting with a total of 120 products on a self-order terminal. From our observations during the experiment and a qualitative analysis of the collected gaze data, we derive best practice approaches regarding the integration of eye tracking software into self-service systems. We evaluate several implicit calibration strategies that derive the user’s true focus of attention either from the context of the user interface, or from their interaction with the system. Our results show that the original gaze estimates can be visibly improved by taking into account both contextual and interaction-based information.","PeriodicalId":265192,"journal":{"name":"Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-05-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Conditioning Gaze-Contingent Systems for the Real World: Insights from a Field Study in the Fast Food Industry\",\"authors\":\"Melanie Heck, Janick Edinger, Christian Becker\",\"doi\":\"10.1145/3411763.3451658\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Eye tracking can be used to infer what is relevant to a user, and adapt the content and appearance of an application to support the user in their current task. A prerequisite for integrating such adaptive user interfaces into public terminals is robust gaze estimation. Commercial eye trackers are highly accurate, but require prior person-specific calibration and a relatively stable head position. In this paper, we collect data from 26 authentic customers of a fast food restaurant while interacting with a total of 120 products on a self-order terminal. From our observations during the experiment and a qualitative analysis of the collected gaze data, we derive best practice approaches regarding the integration of eye tracking software into self-service systems. We evaluate several implicit calibration strategies that derive the user’s true focus of attention either from the context of the user interface, or from their interaction with the system. 
Our results show that the original gaze estimates can be visibly improved by taking into account both contextual and interaction-based information.\",\"PeriodicalId\":265192,\"journal\":{\"name\":\"Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-05-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3411763.3451658\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Extended Abstracts of the 2021 CHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3411763.3451658","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1

Abstract

Eye tracking can be used to infer what is relevant to a user, and adapt the content and appearance of an application to support the user in their current task. A prerequisite for integrating such adaptive user interfaces into public terminals is robust gaze estimation. Commercial eye trackers are highly accurate, but require prior person-specific calibration and a relatively stable head position. In this paper, we collect data from 26 authentic customers of a fast food restaurant while they interact with a total of 120 products on a self-order terminal. From our observations during the experiment and a qualitative analysis of the collected gaze data, we derive best practice approaches regarding the integration of eye tracking software into self-service systems. We evaluate several implicit calibration strategies that derive the user's true focus of attention either from the context of the user interface, or from their interaction with the system. Our results show that the original gaze estimates can be visibly improved by taking into account both contextual and interaction-based information.
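The abstract gives no implementation details, but the interaction-based calibration strategy it describes can be sketched roughly as follows: whenever a customer taps a product tile on the self-order terminal, the tile centre is assumed to be the true focus of attention and is paired with the raw gaze estimate recorded at that moment; a correction transform fitted to these pairs is then applied to subsequent gaze samples. The snippet below is a minimal illustration of this idea, assuming an affine correction fitted by least squares; the function names and sample coordinates are hypothetical and not taken from the paper.

```python
import numpy as np

def fit_affine_correction(raw_gaze, true_targets):
    """Fit a least-squares affine map from raw gaze estimates to assumed true fixation points.

    raw_gaze:     (N, 2) array of on-screen gaze estimates (pixels) recorded at interaction time
    true_targets: (N, 2) array of assumed attention points, e.g. centres of tapped product tiles
    Returns a 2x3 matrix A such that corrected = A @ [x, y, 1].
    """
    X = np.hstack([raw_gaze, np.ones((len(raw_gaze), 1))])  # homogeneous coordinates
    A, *_ = np.linalg.lstsq(X, true_targets, rcond=None)    # solve X @ A ~ true_targets
    return A.T                                              # shape (2, 3)

def correct(gaze_xy, A):
    """Apply the fitted correction to a single raw gaze estimate."""
    x, y = gaze_xy
    return A @ np.array([x, y, 1.0])

# Hypothetical example: each touch event is paired with the raw gaze
# sample recorded at the moment of the tap.
raw = np.array([[412, 310], [980, 305], [415, 720], [985, 715]], dtype=float)
tap = np.array([[400, 300], [1000, 300], [400, 700], [1000, 700]], dtype=float)
A = fit_affine_correction(raw, tap)
print(correct((700, 510), A))  # corrected estimate for a new gaze sample
```

A contextual strategy could feed the same fitting routine by pairing raw gaze samples with salient interface elements that are very likely to be fixated, instead of with touch locations.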