Development of an untethered, mobile, low-cost head-mounted eye tracker

Elizabeth S. Kim, A. Naples, G. V. Gearty, Quan Wang, Seth Wallace, Carla A. Wall, Michael Perlmutter, F. Volkmar, F. Shic, L. Friedlaender, J. Kowitt, B. Reichow
DOI: 10.1145/2578153.2578209
Journal: Proceedings of the Symposium on Eye Tracking Research and Applications
Published: 2014-03-26
Citations: 12

Abstract

Head-mounted eye-tracking systems allow us to observe participants' gaze behaviors in largely unconstrained, real-world settings. We have developed novel, untethered, mobile, low-cost, lightweight, easily assembled head-mounted eye-tracking devices, composed entirely of off-the-shelf components, including untethered, point-of-view sports cameras. In total, the parts we have used cost ~$153, and we suggest untested alternative components that reduce the cost of parts to ~$31. Our device can be easily assembled using hobbyist skills and techniques. We have developed hardware, software, and methodological techniques to perform point-of-regard estimation, and to temporally align scene and eye videos in the face of variable frame rate, which plagues low-cost, lightweight, untethered cameras. We describe an innovative technique for synchronizing eye and scene videos using synchronized flashing lights. Our hardware, software, and calibration designs will be made publicly available, and we describe them in detail here, to facilitate replication of our system. We also describe a novel smooth-pursuit-based calibration methodology, which affords rich sampling of calibration data while compensating for lack of information regarding the extent of visibility on participants' scene recordings. Validation experiments indicate accuracy within 0.752 degrees of visual angle on average.
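The paper's own synchronization code is not reproduced in this abstract; the following is a minimal sketch of the flash-based alignment idea it describes, assuming each video has already been reduced to a per-frame mean-brightness array. All function and variable names here are illustrative, not the authors' API.

```python
import numpy as np

def flash_frame(brightness, thresh=200.0):
    """Index of the first frame whose mean brightness exceeds `thresh`,
    i.e., the frame in which the synchronization flash appears."""
    idx = int(np.argmax(brightness > thresh))
    if brightness[idx] <= thresh:
        raise ValueError("no flash found in recording")
    return idx

def align_offset(eye_brightness, scene_brightness, eye_fps, scene_fps):
    """Offset (seconds) to add to eye-video timestamps so that the flash
    events in the two recordings coincide."""
    t_eye = flash_frame(eye_brightness) / eye_fps
    t_scene = flash_frame(scene_brightness) / scene_fps
    return t_scene - t_eye

# Synthetic per-frame mean brightness: flash at frame 30 of the eye video
# (60 fps) and frame 15 of the scene video (30 fps) -- both at t = 0.5 s.
eye = np.full(100, 80.0);   eye[30] = 255.0
scene = np.full(100, 80.0); scene[15] = 255.0
print(align_offset(eye, scene, eye_fps=60.0, scene_fps=30.0))  # → 0.0
```

Because the offset is computed from an observed event rather than from camera clocks, this approach tolerates the variable frame rates the abstract mentions, provided the flash is detected in both streams.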
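The abstract does not specify the form of the point-of-regard mapping. A common choice for head-mounted trackers, shown here purely as an assumed sketch, is a quadratic least-squares regression from pupil coordinates to scene-camera coordinates; the dense sample stream produced by smooth-pursuit calibration is well suited to fitting such a model.

```python
import numpy as np

def design(px, py):
    """Second-order polynomial features of pupil position."""
    return np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])

def fit_gaze_map(pupil_xy, scene_xy):
    """Least-squares fit of a quadratic map: pupil -> scene point of regard."""
    A = design(pupil_xy[:, 0], pupil_xy[:, 1])
    coef, *_ = np.linalg.lstsq(A, scene_xy, rcond=None)
    return coef  # shape (6, 2): one column per scene coordinate

def predict(coef, pupil_xy):
    return design(pupil_xy[:, 0], pupil_xy[:, 1]) @ coef

# Dense synthetic calibration samples, as smooth pursuit would provide,
# generated from a known quadratic ground-truth mapping.
rng = np.random.default_rng(0)
pupil = rng.uniform(-1, 1, size=(200, 2))
true = np.column_stack([320 + 300 * pupil[:, 0] + 20 * pupil[:, 0] ** 2,
                        240 + 220 * pupil[:, 1] + 15 * pupil[:, 0] * pupil[:, 1]])
coef = fit_gaze_map(pupil, true)
err = np.abs(predict(coef, pupil) - true).max()
print(err < 1e-6)  # the ground truth lies in the model class, so the fit recovers it
```

With real data the residual of such a fit, expressed in degrees of visual angle, is the kind of accuracy figure the validation experiments report.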