Proceedings of the 7th Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction: Latest Publications

Eye tracking in naturalistic badminton play: comparing visual gaze pattern strategy in world-rank and amateur player
Nithiya Shree Uppara, Aditi Mavalankar, K. Vemuri
DOI: 10.1145/3208031.3208037 (published 2018-06-15)
Abstract: A professional player's expertise rests on the ability to predict action by optimally extracting the opponent's postural cues. Eye-tracking (head-mounted system) data from naturalistic singles badminton play was collected from one professional world-ranked player facing five amateur players (10 serves each, 50 trials) and from two amateurs each playing against four other amateur players (10 serves each, 80 trials). Visual gaze on the opponent's body, segregated into three areas of interest covering the opponent's feet, face/torso, and hand/racket, plus the shuttle, was analysed for (a) the period just before the serve, (b) while receiving the serve, and (c) the entire rally. The comparative analysis shows that the professional player's first area of interest was the opponent's feet while executing the serve and the hand/racket when receiving one, whereas the amateur players showed no particular fixation-location strategy either when serving or when facing a serve. The average fixation duration just before the serve was 0.96 s for the professional and 1.48 s for the amateurs. The findings highlight differences between professional and amateur players in which postural cues are considered important and in preparatory time. We believe analytical models built from dynamic gaze behavior in naturalistic game conditions, as applied in this study, can be used to enhance perceptual-cognitive skills during training.
Citations: 3
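The per-AOI fixation analysis described in the abstract above can be sketched as a simple aggregation. This is not the authors' code; the data format (AOI label, duration pairs) and the sample values are illustrative only:

```python
from collections import defaultdict

def aoi_fixation_stats(fixations):
    """Aggregate fixation durations per area of interest (AOI).

    `fixations` is a list of (aoi_label, duration_seconds) tuples,
    with labels such as "feet", "face/torso", "hand/racket", "shuttle".
    Returns {aoi: (total_duration, mean_duration, fixation_count)}.
    """
    buckets = defaultdict(list)
    for aoi, dur in fixations:
        buckets[aoi].append(dur)
    return {
        aoi: (sum(durs), sum(durs) / len(durs), len(durs))
        for aoi, durs in buckets.items()
    }

# Toy pre-serve scan path (hypothetical numbers, not study data):
sample = [("feet", 0.96), ("hand/racket", 0.40), ("feet", 1.10)]
stats = aoi_fixation_stats(sample)
```

Comparing such per-AOI summaries between players is one way the serve-phase strategy difference reported above could be quantified.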
The art of pervasive eye tracking: unconstrained eye tracking in the Austrian Gallery Belvedere
Thiago Santini, Hanna Brinkmann, Luise Reitstätter, H. Leder, R. Rosenberg, W. Rosenstiel, Enkelejda Kasneci
DOI: 10.1145/3208031.3208032 (published 2018-06-15)
Abstract: Pervasive mobile eye tracking provides a rich data source for investigating natural human behavior, offering a high degree of ecological validity in natural environments. However, the challenges and limitations intrinsic to unconstrained mobile eye tracking make its development and use to some extent an art. Nonetheless, researchers are pushing the boundaries of this technology to help assess museum visitors' attention not only between exhibited works but also within particular pieces, providing significantly more detailed insights than traditional timing-and-tracking or external-observer approaches. In this paper, we present in detail the eye-tracking system developed for a large-scale, fully unconstrained study in the Austrian Gallery Belvedere, providing useful information for eye-tracking system designers. Furthermore, we describe the study and report usability and real-time performance metrics. Our results suggest that, although the system is comfortable enough, further eye-tracker improvements are necessary to make it less conspicuous. Additionally, real-time accuracy already suffices for simple applications, such as audio guides, for the majority of users, even in the absence of eye-tracker slippage compensation.
Citations: 32
Crowdsourcing pupil annotation datasets: boundary vs. center, what performs better?
David Gil de Gómez Pérez, M. Suokas, R. Bednarik
DOI: 10.1145/3208031.3208036 (published 2018-06-15)
Abstract: Pupil-related feature detection is one of the most common approaches in the eye-tracking literature and practice. Validation and benchmarking of detection algorithms rely on accurate ground-truth datasets, but creating these is costly. Many approaches have been used to obtain human annotations. A recent proposal obtains these work-intensive data through crowdsourced registration of the pupil center, in which a large number of users each provide a single click to indicate the pupil center [Gil de Gómez Pérez and Bednarik 2018a]. In this paper, we compare the existing approach to a method based on multiple clicks on the boundary of the pupil region, in order to determine which approach provides better results. To compare the two methods, a new data collection was performed over the same image database, and several metrics were applied to evaluate the accuracy of each.
Citations: 1
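Both annotation schemes compared above ultimately yield a pupil-center estimate. A minimal sketch (not the paper's method; for a roughly circular pupil the centroid of boundary clicks approximates the center, while center clicks are averaged directly):

```python
def center_from_clicks(points):
    """Estimate the pupil center as the centroid of annotator clicks.

    Works for both schemes: a set of single center clicks from many
    annotators, or multiple clicks along the pupil boundary (for a
    near-circular pupil the boundary centroid approximates the center).
    """
    xs, ys = zip(*points)
    return (sum(xs) / len(xs), sum(ys) / len(ys))

# Boundary clicks at four points of a circle centred on (10, 20):
boundary = [(15, 20), (5, 20), (10, 25), (10, 15)]
cx, cy = center_from_clicks(boundary)
```

For strongly elliptical or partially occluded pupils an ellipse fit to the boundary clicks would be more robust than a plain centroid, which is one reason the boundary scheme can carry more information than single center clicks.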
Proceedings of the 7th Workshop on Pervasive Eye Tracking and Mobile Eye-Based Interaction (front matter)
DOI: 10.1145/3208031 (published 2018-06-15)
Citations: 0
Making stand-alone PS-OG technology tolerant to the equipment shifts
R. Zemblys, Oleg V. Komogortsev
DOI: 10.1145/3208031.3208035 (published 2018-06-15)
Abstract: Tracking users' gaze in virtual-reality headsets allows natural and intuitive interaction with virtual avatars and virtual objects. Moreover, a technique known as foveated rendering can help save computational resources and enable high-resolution but lightweight virtual-reality technologies. Eye-tracking hardware in modern VR headsets predominantly consists of infrared cameras and LEDs. Such hardware, together with image-processing software, consumes a substantial amount of energy and, when high-speed gaze detection is needed, can be very expensive. A promising technique to overcome these issues is photosensor oculography (PS-OG), which allows eye tracking with a high sampling rate and low power consumption. However, the main limitation of previous PS-OG systems is their inability to compensate for equipment shifts. In this study, we employ a simple multi-layer perceptron neural network to map raw sensor data to gaze locations and report its shift-compensation performance. Modeling and evaluation are done via simulation.
Citations: 6
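The core idea above is a learned mapping from raw photosensor readings to gaze coordinates. A minimal, dependency-free forward pass of such a multi-layer perceptron; the layer sizes (8 sensors, 16 hidden units) and the untrained random weights are hypothetical, since the paper's exact architecture and training are not reproduced here:

```python
import random

random.seed(0)

def mlp_forward(sensors, w1, b1, w2, b2):
    """Map raw photosensor readings to a 2D gaze estimate.

    w1: list of per-hidden-unit weight vectors (len == len(sensors)),
    w2: list of per-output weight vectors (len == hidden size).
    """
    # Hidden layer with ReLU activation.
    hidden = [max(0.0, sum(s * w for s, w in zip(sensors, col)) + b)
              for col, b in zip(w1, b1)]
    # Linear output layer -> (gaze_x, gaze_y).
    return [sum(h * w for h, w in zip(hidden, col)) + b
            for col, b in zip(w2, b2)]

# Hypothetical shapes: 8 photosensors, 16 hidden units, 2D gaze output.
w1 = [[random.gauss(0, 1) for _ in range(8)] for _ in range(16)]
b1 = [0.0] * 16
w2 = [[random.gauss(0, 1) for _ in range(16)] for _ in range(2)]
b2 = [0.0] * 2
gaze = mlp_forward([random.gauss(0, 1) for _ in range(8)], w1, b1, w2, b2)
```

Training such a network on sensor data collected across simulated headset positions is what lets the mapping absorb equipment shifts that a fixed calibration cannot.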
Gaze-based interest detection on newspaper articles
Soumy Jacob, S. S. Bukhari, Shoya Ishimaru, A. Dengel
DOI: 10.1145/3208031.3208034 (published 2018-06-15)
Abstract: Eye-tracking measures have been used to recognize cognitive states involving mental workload, comprehension, and self-confidence during reading. In this paper, we present how these measures can be used to detect a reader's interest. From the reading behavior of 13 university students on 18 newspaper articles, we extracted features related to fixations, saccades, blinks, and pupil diameter to detect which documents each participant found interesting or uninteresting. We classified their level of interest into four classes with 44% accuracy using eye movements alone, rising to 62% when a survey on subjective comprehension was included. This research can be incorporated into real-time prediction of a user's interest while reading, toward better designs for human-document interaction.
Citations: 6
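The pipeline above feeds fixation, saccade, blink, and pupil measures into a classifier. A minimal sketch of the feature-extraction step; the specific features and the data format are illustrative assumptions, not the authors' feature set:

```python
def reading_features(fixation_durs, saccade_amps, blink_count,
                     pupil_diams, read_time_s):
    """Collapse raw eye-movement measures for one article into a
    feature vector of the kind fed to an interest classifier."""
    mean = lambda xs: sum(xs) / len(xs)
    return {
        "mean_fixation_dur": mean(fixation_durs),   # seconds
        "mean_saccade_amp": mean(saccade_amps),     # degrees
        "blink_rate_per_s": blink_count / read_time_s,
        "mean_pupil_diam": mean(pupil_diams),       # millimetres
    }

# One article's (hypothetical) raw measures over a 60 s read:
feats = reading_features([0.2, 0.3, 0.25], [2.1, 3.4], 6,
                         [3.1, 3.3], 60.0)
```

Vectors like this, one per article per participant, are what a standard classifier would be trained on to separate the four interest classes reported above.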
Introducing I2head database
Ion Martinikorena, R. Cabeza, A. Villanueva, Sonia Porta
DOI: 10.1145/3208031.3208033 (published 2018-06-15)
Abstract: The I2Head database has been created with the aim of becoming an optimal reference for low-cost gaze estimation. It exhibits the following characteristics: it takes into account key aspects of low-resolution eye-tracking technology; it combines images of users gazing at different grids of points from alternative positions with registers of the user's head position; and it provides camera-calibration information and a simple 3D head model for each user. The hardware used to build the database includes a 6D magnetic sensor and a webcam. A careful calibration method between the sensor and the camera was developed to guarantee the accuracy of the data. Several sessions were recorded for each user, including not only static head scenarios but also controlled displacements and even free head movements. The database is an outstanding framework for testing both gaze-estimation algorithms and head-pose-estimation methods.
Citations: 14