ACM Symposium on Applied Perception 2020: Latest Publications

Far Distance Estimation in Mixed Reality
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407933
Holly C. Gagnon, Lauren E. Buck, Taylor N. Smith, G. Narasimham, Jeanine K. Stefanucci, Sarah H. Creem-Regehr, Bobby Bodenheimer
{"title":"Far Distance Estimation in Mixed Reality","authors":"Holly C. Gagnon, Lauren E. Buck, Taylor N. Smith, G. Narasimham, Jeanine K. Stefanucci, Sarah H. Creem-Regehr, Bobby Bodenheimer","doi":"10.1145/3385955.3407933","DOIUrl":"https://doi.org/10.1145/3385955.3407933","url":null,"abstract":"Mixed reality is increasingly being used for training, especially for tasks that are difficult for humans to perform such as far distance perception. Although there has been much prior work on the perception of distance in mixed reality, the majority of the work has focused on distances ranging up to about 30 meters. The little work that has investigated distance perception beyond 30 meters has been mixed in terms of whether people tend to over- or underestimate distances in this range. In the current study, we investigated the perception of distances ranging from 25-500 meters in a fully mediated outdoor environment using the ZED Mini and HTC Vive Pro. To our knowledge, no work has examined distance perception to this range in mixed reality. Participants in the study verbally estimated the distances to a variety of objects. Distances were overestimated from 25-200 meters but underestimated from 300-500 meters. Overall, far distance perception in mixed reality may be biased in different directions depending on the range of distances, suggesting a need for future research in this area to support training of far distance perception.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114716349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 7
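As a concrete illustration of the analysis this abstract describes, the sketch below summarizes verbal distance estimates as estimation ratios (estimate divided by true distance), where ratios above 1 indicate overestimation and below 1 underestimation. The trial data are invented for illustration and are not results from the study.

```python
# Hypothetical sketch: summarizing verbal distance estimates as estimation
# ratios (estimate / true distance). Values > 1 mean overestimation,
# values < 1 mean underestimation. The data below are made up.
from collections import defaultdict

# (true_distance_m, verbal_estimate_m) pairs pooled over participants
trials = [
    (25, 30), (25, 28), (100, 120), (100, 110),
    (200, 230), (300, 260), (500, 410), (500, 430),
]

by_distance = defaultdict(list)
for true_d, estimate in trials:
    by_distance[true_d].append(estimate / true_d)

for true_d in sorted(by_distance):
    ratios = by_distance[true_d]
    mean_ratio = sum(ratios) / len(ratios)
    bias = "over" if mean_ratio > 1 else "under"
    print(f"{true_d:>4} m: mean ratio = {mean_ratio:.2f} ({bias}estimated)")
```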
The Interestingness of 3D Shapes
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407925
Manfred Lau, Luther Power
{"title":"The Interestingness of 3D Shapes","authors":"Manfred Lau, Luther Power","doi":"10.1145/3385955.3407925","DOIUrl":"https://doi.org/10.1145/3385955.3407925","url":null,"abstract":"The perception of shapes and images are important areas that have been explored in previous research. While the problem of the interestingness of images has been explored, there is no previous work in the interestingness of 3D shapes to the best of our knowledge. In this paper, we study this novel problem. We collect data on how humans perceive the interestingness of 3D shapes and analyze the data to gain insights on what makes a shape interesting. We show that a function can be learned to autonomously predict an interestingness score given a new shape and demonstrate some interestingness-guided 3D shape applications.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127633656","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
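The abstract states that a function can be learned to predict an interestingness score for a new shape, but it does not specify the model or features. The following is only a generic sketch under that framing: a regressor fitted to hypothetical per-shape descriptor vectors and crowd-sourced interestingness ratings, not the authors' method.

```python
# Generic sketch (not the authors' method): learn a mapping from shape
# descriptors to an interestingness score with a random forest regressor.
# Feature values and ratings below are invented for illustration.
import numpy as np
from sklearn.ensemble import RandomForestRegressor

# Each row is a hypothetical shape descriptor (e.g. curvature stats, symmetry).
X_train = np.array([
    [0.2, 0.8, 0.1],
    [0.9, 0.3, 0.7],
    [0.5, 0.5, 0.5],
    [0.1, 0.9, 0.2],
])
# Crowd-sourced interestingness ratings for the same shapes (0-1 scale).
y_train = np.array([0.3, 0.9, 0.5, 0.4])

model = RandomForestRegressor(n_estimators=100, random_state=0)
model.fit(X_train, y_train)

new_shape = np.array([[0.7, 0.4, 0.6]])
print("predicted interestingness:", model.predict(new_shape)[0])
```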
Walk Ratio: Perception of an Invariant Parameter of Human Walk on Virtual Characters
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407926
Benjamin Niay, A. Olivier, Katja Zibrek, J. Pettré, Ludovic Hoyet
{"title":"Walk Ratio: Perception of an Invariant Parameter of Human Walk on Virtual Characters","authors":"Benjamin Niay, A. Olivier, Katja Zibrek, J. Pettré, Ludovic Hoyet","doi":"10.1145/3385955.3407926","DOIUrl":"https://doi.org/10.1145/3385955.3407926","url":null,"abstract":"Synthesizing motions that look realistic and diverse is a challenging task in animation. Therefore, a few generic walking motions are typically used when creating crowds of walking virtual characters, leading to a lack of variations as motions are not necessarily adapted to each and every virtual character’s characteristics. While some attempts have been made to create variations, it appears necessary to identify the relevant parameters that influence users’ perception of such variations to keep a good trade-off between computational costs and realism. In this paper, we therefore investigate the ability of viewers to identify an invariant parameter of human walking named the Walk Ratio (step length to step frequency ratio), which was shown to be specific to each individual and constant for different speeds, but which has never been used to drive animations of virtual characters. To this end, we captured 2 female and 2 male actors walking at different freely chosen speeds, as well as at different combinations of step frequency and step length. We then performed a perceptual study to identify the Walk Ratio that was perceived as the most natural for each actor when animating a virtual character, and compared it to the Walk Ratio freely chosen by the actor during the motion capture session. We found that Walk Ratios chosen by observers were in the range of Walk Ratios measured in the literature, and that participants perceived differences between the Walk Ratios of animated male and female characters, as evidenced in the biomechanical literature. Our results provide new considerations to drive the animation of walking virtual characters using the Walk Ratio as a parameter, and might provide animators with novel means to control the walking speed of characters through simple parameters while retaining the naturalness of the locomotion.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"80 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126013145","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
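The Walk Ratio in this abstract is step length divided by step frequency, and walking speed is the product of the two, so a fixed Walk Ratio determines both gait parameters for any target speed. The sketch below shows that derivation; the constant 0.40 m per (steps/s) is an illustrative value, not one taken from the paper.

```python
# Sketch of how a fixed Walk Ratio (step length / step frequency) could drive
# a character's gait parameters for any target speed. The ratio value is
# illustrative, not taken from the paper.
import math

WALK_RATIO = 0.40  # m per (steps/s), i.e. step_length / step_frequency

def gait_from_speed(speed_mps: float, walk_ratio: float = WALK_RATIO):
    """Return (step_length_m, step_frequency_hz) for a target walking speed.

    speed = step_length * step_frequency and walk_ratio = step_length / step_frequency,
    so step_frequency = sqrt(speed / walk_ratio) and step_length = sqrt(speed * walk_ratio).
    """
    step_frequency = math.sqrt(speed_mps / walk_ratio)
    step_length = math.sqrt(speed_mps * walk_ratio)
    return step_length, step_frequency

for speed in (0.8, 1.2, 1.6):  # slow, comfortable, and fast walking speeds
    length, freq = gait_from_speed(speed)
    print(f"{speed:.1f} m/s -> step length {length:.2f} m, frequency {freq:.2f} steps/s")
```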
Evalu-light: A practical approach for evaluating character lighting in real-time
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407937
Pisut Wisessing, R. Mcdonnell
{"title":"Evalu-light: A practical approach for evaluating character lighting in real-time","authors":"Pisut Wisessing, R. Mcdonnell","doi":"10.1145/3385955.3407937","DOIUrl":"https://doi.org/10.1145/3385955.3407937","url":null,"abstract":"The demand for cartoon animation content is on the rise, driven largely by social media, video conferencing and virtual reality. New content creation tools and the availability of open-source game engines allow characters to be animated by even novice creators (e.g., Loomie and Pinscreen for creating characters, Hyprface and MotionX for facial motion capture, etc,). However, lighting of animated characters is a skilled art-form, and there is a lack of guidelines for lighting characters expressing emotion in an appealing manner. Recent perceptual research has attempted to provide user guidelines through rigorous and labor-intensive rating-scale experiments exploring many parameters of light [Wisessing et al. 2016, 2020]. We propose an alternative approach based on the method of adjustment, similar to common lighting tools in 3D content creation software, but with the power of modern real-time graphics. Our framework allows users to interactively adjust lighting parameters and instantly assess the results on the animated characters, instead of having to wait for the render to complete.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116169464","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
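The framework described above relies on the method of adjustment: rather than rating fixed stimuli, users tune a lighting parameter until the character looks right, and the final setting is recorded. Below is only a minimal, engine-agnostic sketch of that measurement loop; the parameter name and ranges are hypothetical, and a real implementation would re-light the character inside the game engine on every change.

```python
# Minimal method-of-adjustment loop (hypothetical, engine-agnostic sketch):
# the participant nudges a lighting parameter until satisfied, and the final
# value is logged as their preferred setting.
def adjust_parameter(name: str, value: float, step: float, lo: float, hi: float) -> float:
    while True:
        # In a real tool the character would be re-lit and re-rendered here.
        cmd = input(f"{name} = {value:.2f}  [+ / - / done] ").strip()
        if cmd == "+":
            value = min(hi, value + step)
        elif cmd == "-":
            value = max(lo, value - step)
        elif cmd == "done":
            return value

preferred_intensity = adjust_parameter("key light intensity", 1.0, 0.1, 0.0, 5.0)
print("recorded preference:", preferred_intensity)
```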
Novel Automotive Lamp Configurations: Computer-Based Assessment of Perceptual Efficiency
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407936
P. Veto
{"title":"Novel Automotive Lamp Configurations: Computer-Based Assessment of Perceptual Efficiency","authors":"P. Veto","doi":"10.1145/3385955.3407936","DOIUrl":"https://doi.org/10.1145/3385955.3407936","url":null,"abstract":"Perceptual studies on pedestrians and bicyclists have shown that highlighting the most informative features of the road user increases visibility in real-life traffic scenarios. As technological limitations on automotive lamp configurations are decreasing, prototypes display novel layouts driven by aesthetic design. Could these advances also be used to aid perceptual processes? We tested this question in a computer-based visual search experiment. 3D rendered images of the frontal view of a motorbike and of a car with regular (REG) or contour enhancing (CON) headlamp configurations were compiled into scenes, simulating the crowding effect in night-time traffic. Perceptual performance in finding a target motorbike (present in 2/3 of all trials) among 28 cars of either condition was quantified through discriminability, reaction time, and eye movement measures. All measures showed a significant perceptual advantage in CON vs REG trials. Results suggest that facilitating object perception in traffic by highlighting relevant features of vehicles could increase visual performance in both speed and discriminability. Furthermore, the associated gain in eye movement efficiency may decrease fatigue. Similar methods, with varying trade-off ratios between fidelity and low-level control, could be applied to design the most ideal lamp configuration for traffic safety.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128448687","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
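The study quantifies performance partly through discriminability in a target-present/target-absent search task. The abstract does not name the exact measure, but a standard choice for such designs is signal-detection d′, computed from hit and false-alarm rates as sketched below with invented trial counts.

```python
# Hypothetical sketch: discriminability (d') for a target-present/absent
# visual search task, computed from hit and false-alarm rates. The trial
# counts are invented; the paper's exact measure is not specified here.
from statistics import NormalDist

def d_prime(hits, misses, false_alarms, correct_rejections):
    z = NormalDist().inv_cdf
    # Clamp rates away from 0 and 1 so the z-transform stays finite.
    def rate(k, n):
        return min(max(k / n, 1.0 / (2 * n)), 1.0 - 1.0 / (2 * n))
    hit_rate = rate(hits, hits + misses)
    fa_rate = rate(false_alarms, false_alarms + correct_rejections)
    return z(hit_rate) - z(fa_rate)

print("d' (CON):", round(d_prime(hits=54, misses=6, false_alarms=3, correct_rejections=27), 2))
print("d' (REG):", round(d_prime(hits=45, misses=15, false_alarms=6, correct_rejections=24), 2))
```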
Incorporating the Perception of Visual Roughness into the Design of Mid-Air Haptic Textures
ACM Symposium on Applied Perception 2020 Pub Date : 2020-09-12 DOI: 10.1145/3385955.3407927
David Beattie, W. Frier, Orestis Georgiou, Benjamin Long, Damien Ablart
{"title":"Incorporating the Perception of Visual Roughness into the Design of Mid-Air Haptic Textures","authors":"David Beattie, W. Frier, Orestis Georgiou, Benjamin Long, Damien Ablart","doi":"10.1145/3385955.3407927","DOIUrl":"https://doi.org/10.1145/3385955.3407927","url":null,"abstract":"Ultrasonic mid-air haptic feedback enables the tactile exploration of virtual objects in digital environments. However, an object’s shape and texture is perceived multimodally, commencing before tactile contact is made. Visual cues, such as the spatial distribution of surface elements, play a critical first step in forming an expectation of how a texture should feel. When rendering surface texture virtually, its verisimilitude is dependent on whether these visually inferred prior expectations are experienced during tactile exploration. To that end, our work proposes a method where the visual perception of roughness is integrated into the rendering algorithm of mid-air haptic texture feedback. We develop a machine learning model trained on crowd-sourced visual roughness ratings of texture images from the Penn Haptic Texture Toolkit (HaTT). We establish tactile roughness ratings for different mid-air haptic stimuli and match these ratings to our model’s output, creating an end-to-end automated visuo-haptic rendering algorithm. We validate our approach by conducting a user study to examine the utility of the mid-air haptic feedback. This work can be used to automatically create tactile virtual surfaces where the visual perception of texture roughness guides the design of mid-air haptic feedback.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-09-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129228690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 13
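The pipeline described above matches a visual roughness prediction for a texture image to the mid-air haptic stimulus whose tactile roughness rating is closest. A minimal sketch of that matching step follows; the stimulus parameters and ratings are invented, and the actual roughness model and rendering algorithm are the authors'.

```python
# Hypothetical sketch of the matching step: given a predicted visual roughness
# for a texture image, pick the mid-air haptic stimulus whose tactile
# roughness rating is closest. Stimuli and ratings below are invented.
# Each entry: (stimulus parameters, mean tactile roughness rating, 0-1 scale)
haptic_stimuli = [
    ({"draw_frequency_hz": 20}, 0.15),
    ({"draw_frequency_hz": 50}, 0.40),
    ({"draw_frequency_hz": 80}, 0.65),
    ({"draw_frequency_hz": 110}, 0.85),
]

def match_stimulus(predicted_visual_roughness: float) -> dict:
    params, _ = min(haptic_stimuli,
                    key=lambda s: abs(s[1] - predicted_visual_roughness))
    return params

# e.g. the visual-roughness model predicts 0.7 for some texture image
print(match_stimulus(0.7))  # -> {'draw_frequency_hz': 80}
```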
RIT-Eyes: Rendering of near-eye images for eye-tracking applications
ACM Symposium on Applied Perception 2020 Pub Date : 2020-06-05 DOI: 10.1145/3385955.3407935
Nitinraj Nair, Rakshit Kothari, A. Chaudhary, Zhizhuo Yang, Gabriel J. Diaz, J. Pelz, Reynold J. Bailey
{"title":"RIT-Eyes: Rendering of near-eye images for eye-tracking applications","authors":"Nitinraj Nair, Rakshit Kothari, A. Chaudhary, Zhizhuo Yang, Gabriel J. Diaz, J. Pelz, Reynold J. Bailey","doi":"10.1145/3385955.3407935","DOIUrl":"https://doi.org/10.1145/3385955.3407935","url":null,"abstract":"Deep neural networks for video-based eye tracking have demonstrated resilience to noisy environments, stray reflections, and low resolution. However, to train these networks, a large number of manually annotated images are required. To alleviate the cumbersome process of manual labeling, computer graphics rendering is employed to automatically generate a large corpus of annotated eye images under various conditions. In this work, we introduce a synthetic eye image generation platform that improves upon previous work by adding features such as an active deformable iris, an aspherical cornea, retinal retro-reflection, gaze-coordinated eye-lid deformations, and blinks. To demonstrate the utility of our platform, we render images reflecting the represented gaze distributions inherent in two publicly available datasets, NVGaze and OpenEDS. We also report on the performance of two semantic segmentation architectures (SegNet and RITnet) trained on rendered images and tested on the original datasets.","PeriodicalId":434621,"journal":{"name":"ACM Symposium on Applied Perception 2020","volume":"31 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-06-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125389509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
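The abstract reports the performance of two segmentation architectures trained on rendered eye images and tested on the original datasets. The metric is not stated here; a common way to score such models is per-class intersection-over-union (IoU), sketched below on toy label maps as an assumption rather than the paper's protocol.

```python
# Hypothetical sketch: mean intersection-over-union (IoU) between a predicted
# and a ground-truth segmentation mask (e.g. background / sclera / iris / pupil).
# The metric choice and the toy masks are assumptions, not taken from the paper.
import numpy as np

def mean_iou(pred: np.ndarray, truth: np.ndarray, num_classes: int) -> float:
    ious = []
    for c in range(num_classes):
        inter = np.logical_and(pred == c, truth == c).sum()
        union = np.logical_or(pred == c, truth == c).sum()
        if union > 0:
            ious.append(inter / union)
    return float(np.mean(ious))

truth = np.array([[0, 0, 1, 1],
                  [0, 2, 2, 1],
                  [0, 2, 3, 1]])
pred = np.array([[0, 0, 1, 1],
                 [0, 2, 3, 1],
                 [0, 2, 3, 1]])
print("mean IoU:", round(mean_iou(pred, truth, num_classes=4), 3))
```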