Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications: Latest Publications

Robust eye contact detection in natural multi-person interactions using gaze and speaking behaviour
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204549
Authors: P. Müller, Michael Xuelin Huang, Xucong Zhang, A. Bulling
Abstract: Eye contact is one of the most important non-verbal social cues and is fundamental to human interaction. However, detecting eye contact without specialised eye-tracking equipment poses significant challenges, particularly for multiple people in real-world settings. We present a novel method to robustly detect eye contact in natural three- and four-person interactions using off-the-shelf ambient cameras. Our method exploits the fact that, during conversations, people tend to look at the person who is currently speaking. Harnessing this correlation between gaze and speaking behaviour allows our method to automatically acquire training data during deployment and to adaptively train an eye contact detector for each target user. We empirically evaluate the performance of our method on a recent dataset of natural group interactions and demonstrate a relative improvement of more than 60% over the state-of-the-art method, as well as an improvement over a head-pose-based baseline.
Citations: 39
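
The key idea is weak supervision: speaking activity serves as a noisy proxy for ground-truth gaze labels, so the detector can be trained during deployment without manual annotation. A minimal sketch of that idea, assuming precomputed per-frame gaze features (a NumPy array) and per-frame speaker annotations; the function and variable names are hypothetical and this is not the authors' implementation:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def train_eye_contact_detector(gaze_features, speaker_ids, target_id):
    """Train a detector for 'looks at target_id', using speaking activity
    as a noisy proxy label: frames in which target_id speaks are treated
    as positives, frames in which someone else speaks as negatives."""
    has_speaker = np.array([s is not None for s in speaker_ids])
    labels = np.array([1 if s == target_id else 0 for s in speaker_ids])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(gaze_features[has_speaker], labels[has_speaker])
    return clf  # clf.predict(new_features) then flags likely eye contact
```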

Implicit user calibration for gaze-tracking systems using an averaged saliency map around the optical axis of the eye
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204572
Authors: Mamoru Hiroe, Michiya Yamamoto, Takashi Nagamatsu
Abstract: A 3D gaze-tracking method that uses two cameras and two light sources can measure the optical axis of the eye without user calibration. The visual axis of the eye (the line of sight) is then estimated by conducting a single-point user calibration, which determines the angle k between the optical and visual axes of the eye, a user-dependent parameter. We propose an implicit user calibration method for gaze-tracking systems that uses a saliency map averaged around the optical axis of the eye. We assume that the peak of the averaged saliency maps indicates the visual axis of the eye in the eye coordinate system, and we make effective use of constraints between both eyes. The experimental results show that the proposed system can estimate angle k without explicit personal calibration.
Citations: 4
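
The calibration itself reduces to averaging saliency maps in an eye-centred frame and reading the offset of the peak from the map centre. A toy sketch under those assumptions (maps already aligned to the optical axis, uniform angular resolution); illustrative only, not the authors' implementation:

```python
import numpy as np

def estimate_angle_k(saliency_maps, deg_per_px):
    """saliency_maps: iterable of 2D arrays in an eye-centred frame with
    the optical axis at the centre. Returns the (x, y) angular offset of
    the averaged peak in degrees, a rough stand-in for angle k."""
    avg = np.mean(np.stack(saliency_maps), axis=0)        # average over fixations
    cy, cx = (np.array(avg.shape) - 1) / 2.0              # optical-axis pixel
    py, px = np.unravel_index(np.argmax(avg), avg.shape)  # peak of the average
    return ((px - cx) * deg_per_px, (py - cy) * deg_per_px)
```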

New features of ScanGraph: a tool for revealing participants' strategy from eye-movement data
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208334
Authors: S. Popelka, J. Dolezalová, Marketa Beitlova
Abstract: This demo describes new features of ScanGraph, an application for finding participants who share a similar stimulus-reading strategy, based on their sequences of visited Areas of Interest. The result is visualised as cliques of a simple graph. ScanGraph was initially introduced in 2016, and several features have been added since the original publication. The first is the implementation of the Damerau-Levenshtein algorithm for similarity calculation. Additionally, the heuristic clique-finding algorithm used in the original version was replaced by the Bron-Kerbosch algorithm. ScanGraph reads data from the open-source application OGAMA and, via a conversion tool, also from SMI BeGaze, which allows dynamic stimuli to be analysed as well. The most prominent enhancement is the possibility of calculating similarity among participants not only for a single stimulus but for multiple files at once.
Citations: 6
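
Both algorithms named in the abstract are standard and compact enough to sketch. Below, a restricted Damerau-Levenshtein distance over AOI sequences feeds a similarity graph whose maximal cliques are enumerated with networkx's find_cliques, which implements the Bron-Kerbosch algorithm; the input format (a dict mapping participant names to AOI strings) and the similarity threshold are assumptions, not ScanGraph's actual interface:

```python
import networkx as nx

def damerau_levenshtein(a, b):
    """Edit distance with insertions, deletions, substitutions and
    adjacent transpositions (restricted Damerau-Levenshtein)."""
    d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
    for i in range(len(a) + 1):
        d[i][0] = i
    for j in range(len(b) + 1):
        d[0][j] = j
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            cost = 0 if a[i - 1] == b[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
            if i > 1 and j > 1 and a[i - 1] == b[j - 2] and a[i - 2] == b[j - 1]:
                d[i][j] = min(d[i][j], d[i - 2][j - 2] + 1)  # transposition
    return d[-1][-1]

def similarity_cliques(sequences, threshold=0.8):
    """sequences: dict of participant name -> AOI string, e.g. {'p1': 'ABAC'}.
    Links participants whose normalised similarity meets the threshold and
    returns the maximal cliques of the resulting graph."""
    g = nx.Graph()
    g.add_nodes_from(sequences)
    names = list(sequences)
    for i, p in enumerate(names):
        for q in names[i + 1:]:
            dist = damerau_levenshtein(sequences[p], sequences[q])
            sim = 1.0 - dist / max(len(sequences[p]), len(sequences[q]), 1)
            if sim >= threshold:
                g.add_edge(p, q)
    return list(nx.find_cliques(g))  # Bron-Kerbosch under the hood
```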

Capturing real-world gaze behaviour: live and unplugged
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204528
Authors: Karishma Singh, Mahmoud Kalash, Neil D. B. Bruce
Abstract: Understanding human gaze behaviour has benefits ranging from scientific understanding to many application domains. Current practices constrain possible use cases, restricting experimentation to a lab setting or controlled environment. In this paper, we demonstrate a flexible, unconstrained end-to-end solution that allows gaze data to be collected and analysed in real-world settings. To achieve this, rich 3D models of the real world are derived, along with strategies for associating experimental eye-tracking data with these models. In particular, we demonstrate the strength of photogrammetry in making these capabilities possible, and present the first complete solution for 3D gaze analysis in large-scale outdoor environments using standard camera technology without fiducial markers. The paper also presents techniques for quantitative analysis and visualization of 3D gaze data. As a whole, the techniques presented provide a foundation for future research, with new opportunities for experimental studies and computational modeling efforts.
Citations: 7
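
A core step in associating eye-tracking data with a photogrammetric model is intersecting a gaze ray with the reconstructed scene geometry. The paper does not spell out its implementation, so the following is only an illustrative sketch using Möller-Trumbore ray-triangle intersection, the standard primitive for this kind of mapping:

```python
import numpy as np

def intersect_gaze_ray(origin, direction, v0, v1, v2, eps=1e-9):
    """Return the 3D point where the gaze ray hits triangle (v0, v1, v2),
    or None if it misses. All inputs are length-3 NumPy arrays."""
    e1, e2 = v1 - v0, v2 - v0
    p = np.cross(direction, e2)
    det = e1.dot(p)
    if abs(det) < eps:
        return None                      # ray parallel to triangle plane
    t_vec = origin - v0
    u = t_vec.dot(p) / det
    q = np.cross(t_vec, e1)
    v = direction.dot(q) / det
    t = e2.dot(q) / det
    if u < 0 or v < 0 or u + v > 1 or t < 0:
        return None                      # intersection outside the triangle
    return origin + t * direction        # gaze hit point on the model
```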

Evaluating gender difference on algorithmic problems using eye-tracker
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204537
Authors: U. Obaidellah, Mohammed Al Haek
Abstract: Gender differences in programming comprehension have been a topic of discussion in recent years. We conducted an eye-tracking study on 51 (21 female, 30 male) computer science undergraduate students to examine their cognitive processes during pseudocode comprehension. We aim to identify their reading strategies and eye-gaze behaviour when comprehending pseudocode, in terms of performance and visual effort, while solving algorithmic problems of varying difficulty levels. Each student completed a series of tasks requiring them to rearrange randomized pseudocode statements into the correct order for the problem presented. Our results indicate that male students analysed the problems faster, although female students fixated longer on understanding the problem requirements. In addition, female students fixated more often on indicative verbs (e.g., prompt, print), while male students fixated more on operational statements (e.g., loops, variable calculations, file handling).
Citations: 8

EyeMic
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208342
Authors: Shaharam Eivazi, Maximilian Maurer
Abstract: The concept of a hands-free surgical microscope has become increasingly popular in the domain of microsurgery. The higher the magnification, the smaller the field of view, which necessitates frequent interaction with the microscope during an operation. Researchers have shown that manual (hand) interactions with a surgical microscope result in disruptive and hazardous situations. Previously, we proposed the idea of an eye-controlled microscope as a solution to this interaction problem. While gaze-contingent applications have been widely studied in the HCI and eye-tracking domains, the lack of ocular-based eye trackers for microscopes remains an important concern. To address this problem and make it possible to capture eye movements in microsurgery in real time, we present EyeMic, a binocular eye tracker that can be attached on top of any microscope ocular. Our eye tracker is only 5 mm high, to guarantee the same field of view, and it supports eye-movement recording at up to 120 frames per second.
Citations: 2

Developing photo-sensor oculography (PS-OG) system for virtual reality headsets
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3208341
Authors: R. Zemblys, Oleg V. Komogortsev
Abstract: Virtual reality (VR) is employed in a variety of different applications. It is our belief that eye tracking will become part of the majority of VR devices, reducing computational burden via a technique called foveated rendering and increasing the immersion of the VR environment. A promising technique for achieving low-energy, fast, and accurate eye tracking is photo-sensor oculography (PS-OG). PS-OG technology enables tracking a user's gaze location at very fast rates, 1000 Hz or more, and is expected to consume several orders of magnitude less power than a traditional video-oculography approach. In this demo we present a prototype of the PS-OG system we have started to develop. The long-term aim of our project is to develop a PS-OG system that is robust to sensor shifts. As a first step, we have built a prototype that allows us to test different sensors and their configurations, as well as record and analyze eye-movement data.
Citations: 6
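
PS-OG systems typically recover gaze coordinates from raw photo-sensor intensities with a lightweight mapping, which is what makes kilohertz-rate, low-power operation plausible. A conceptual ridge-regression sketch of such a mapping; the linear model, sensor layout, and calibration data are assumptions, not details of the authors' prototype:

```python
import numpy as np

def fit_psog_mapping(sensor_readings, gaze_targets, reg=1e-3):
    """sensor_readings: (n_samples, n_sensors) array of intensities taken
    while the user fixates known targets; gaze_targets: (n_samples, 2).
    Returns weights of a ridge-regularised linear map with a bias term."""
    X = np.hstack([sensor_readings, np.ones((len(sensor_readings), 1))])
    return np.linalg.solve(X.T @ X + reg * np.eye(X.shape[1]),
                           X.T @ gaze_targets)

def predict_gaze(weights, reading):
    """Map one sensor reading (n_sensors,) to a 2D gaze estimate."""
    return np.append(reading, 1.0) @ weights
```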

Enhanced representation of web pages for usability analysis with eye tracking
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204535
Authors: Raphael Menges, Hanadi Tamimi, C. Kumar, T. Walber, Christoph Schaefer, Steffen Staab
Abstract: Eye tracking, as a tool to quantify user attention, plays a major role in research and application design. For Web page usability, it has become a prominent measure for assessing which sections of a Web page are read, glanced at, or skipped. Such assessments primarily depend on mapping gaze data to a representation of the Web page. However, current representation methods, a virtual screenshot of the Web page or a video recording of the complete interaction session, suffer from either accuracy or scalability issues. We present a method that identifies fixed elements on Web pages and combines user viewport screenshots in relation to those fixed elements to produce an enhanced representation of the page. We conducted an experiment with 10 participants, and the results show that analysis with our method is more efficient than with a video recording, an essential criterion for large-scale Web studies.
Citations: 8
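
The enhanced representation amounts to compositing viewport screenshots into a page-sized canvas at their scroll offsets while excluding the fixed elements that do not scroll with the page. A simplified sketch of that compositing step; representing fixed elements as masked viewport rows is an assumption made for brevity, not the paper's exact method:

```python
import numpy as np

def stitch_viewports(page_h, page_w, shots, fixed_rows):
    """shots: list of (scroll_y, image) pairs, image shape (h, w, 3).
    fixed_rows: (top, bottom) viewport row ranges covered by fixed
    elements such as sticky headers, which must not be pasted."""
    canvas = np.zeros((page_h, page_w, 3), dtype=np.uint8)
    for scroll_y, img in shots:
        keep = np.ones(img.shape[0], dtype=bool)
        for top, bottom in fixed_rows:
            keep[top:bottom] = False           # fixed elements do not scroll
        rows = np.nonzero(keep)[0]
        rows = rows[scroll_y + rows < page_h]  # clip to the page bounds
        canvas[scroll_y + rows, :img.shape[1]] = img[rows]
    return canvas
```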

I see what you see: gaze awareness in mobile video collaboration
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3204542
Authors: Deepak Akkil, Biju Thankachan, Poika Isokoski
Abstract: An emerging use of mobile video telephony is to enable joint activities and collaboration on physical tasks. We conducted a controlled user study to understand whether seeing the gaze of a remote instructor benefits mobile video collaboration and whether it matters that the instructor is aware the gaze is being shared. We compared three gaze-sharing configurations, (a) Gaze_Visible, where the instructor is aware of the sharing and can view her own gaze point, (b) Gaze_Invisible, where the instructor is aware of the sharing but cannot view her own gaze point, and (c) Gaze_Unaware, where the instructor is unaware of the gaze sharing, against a shared mouse pointer as a baseline. Our results suggest that naturally occurring gaze may not be as useful as explicitly produced eye movements. Further, instructors prefer using the mouse rather than gaze for remote gesturing, while workers nevertheless find value in receiving the gaze information.
Citations: 13

Towards concise gaze sharing
Proceedings of the 2018 ACM Symposium on Eye Tracking Research & Applications. Pub Date: 2018-06-14. DOI: 10.1145/3204493.3207416
Authors: C. Schlösser
Abstract: Computer-supported collaboration has changed the way we learn and work together, as co-location is no longer a necessity. While presence, pointing, and actions belong to the established inventory of awareness functionality, which aims to inform collaborators about peer activities, visual attention, a beneficial cue for successful collaboration, does not. Several studies have shown that providing real-time gaze cues is advantageous, as it enables more efficient referencing by reducing deictic expressions and fosters joint attention by facilitating shared gaze. However, actual use is held back by inherent limitations: real-time gaze display is often considered distracting, owing to its constant movement and an overall low signal-to-noise ratio, and its transient nature makes it difficult to associate with a dynamic stimulus over time. While it is helpful when referencing or shared gaze is crucial, applying it in common collaborative environments, with their constant alternation between close and loose collaboration, presents challenges. My dissertation work will explore a novel gaze-sharing approach that aims to detect task-related gaze patterns and display them in concise representations. This work will contribute to our understanding of coordination in collaborative environments and propose algorithms and design recommendations for gaze sharing.
Citations: 6