Towards VR Attention Guidance: Environment-dependent Perceptual Threshold for Stereo Inverse Brightness Modulation
Authors: S. Grogorick, Georgia Albuquerque, J. Tauscher, Marc Kassubeck, M. Magnor
DOI: 10.1145/3343036.3343137 (https://doi.org/10.1145/3343036.3343137)
Published: 2019-09-19, ACM Symposium on Applied Perception 2019
Abstract: In this paper, we propose a new method for attention and gaze redirection, specifically designed for immersive stereo displays. Exploiting the dual nature of stereo imagery, our stimulus is composed of complementary parts displayed for each individual eye. This attracts viewers' attention due to induced binocular rivalry. In a perceptual study, we investigate size- and intensity-related perceptual thresholds of our stimulus for six different real-world panorama images. Our results show that a flexible parameterization allows the stimulus to be perceived even in complex surroundings. To prepare for technical innovations expected in future-generation virtual reality headsets, we used a commercially available head-mounted display as well as a high-resolution dps.
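The stimulus described above pairs complementary modifications across the two eyes' images. A minimal sketch of that idea, assuming a simple circular region and a scalar intensity offset (the function and parameter names are illustrative, not from the paper):

```python
import numpy as np

def stereo_inverse_brightness_stimulus(left, right, center, radius, intensity):
    """Sketch of a complementary stereo stimulus: brighten a circular
    region in the left-eye image while darkening the same region in the
    right-eye image, which can induce binocular rivalry.

    left/right are float grayscale images in [0, 1]; center is (row, col).
    All names and the circular-region choice are assumptions for illustration.
    """
    h, w = left.shape[:2]
    yy, xx = np.mgrid[0:h, 0:w]
    # circular mask around the target location
    mask = (yy - center[0]) ** 2 + (xx - center[1]) ** 2 <= radius ** 2
    left_out, right_out = left.copy(), right.copy()
    left_out[mask] = np.clip(left_out[mask] + intensity, 0.0, 1.0)
    right_out[mask] = np.clip(right_out[mask] - intensity, 0.0, 1.0)
    return left_out, right_out
```

The study's size- and intensity-related thresholds would correspond to sweeping `radius` and `intensity` until the stimulus becomes perceivable.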
Spectral Visualization Sharpening
Authors: Liang Zhou, Rudolf Netzel, D. Weiskopf, Chris R. Johnson
DOI: 10.1145/3343036.3343133 (https://doi.org/10.1145/3343036.3343133)
Published: 2019-07-24, ACM Symposium on Applied Perception 2019
Abstract: In this paper, we propose a perceptually guided visualization sharpening technique. We analyze the spectral behavior of an established comprehensive perceptual model to arrive at our approximated model based on an adapted weighting of the bandpass images from a Gaussian pyramid. The main benefit of this approximated model is its controllability and predictability for sharpening color-mapped visualizations. Our method can be integrated into any visualization tool as it adopts generic image-based post-processing, and it is intuitive and easy to use, as viewing distance is the only parameter. Using highly diverse datasets, we show the usefulness of our method across a wide range of typical visualizations.
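The core mechanism above, reweighting bandpass images from a Gaussian pyramid, can be sketched as follows. This is a generic difference-of-Gaussians decomposition with assumed per-band weights, not the paper's perceptual model (which derives the weights from viewing distance):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def bandpass_sharpen(image, weights, sigma=1.0):
    """Sharpen by reweighting bandpass images of a Gaussian pyramid.

    Pyramid levels are built by repeated Gaussian blurring at full
    resolution (no downsampling) for simplicity; `weights` holds one
    gain per band and is an assumed input here. Weights of 1.0
    reproduce the input exactly; weights > 1.0 boost that band.
    """
    levels = [image.astype(float)]
    for _ in weights:
        levels.append(gaussian_filter(levels[-1], sigma))
        sigma *= 2.0  # each level removes roughly one octave
    # bandpass images: differences between adjacent pyramid levels
    bands = [levels[i] - levels[i + 1] for i in range(len(weights))]
    residual = levels[-1]  # lowpass residual
    return residual + sum(w * b for w, b in zip(weights, bands))
```

Because the decomposition telescopes, the reconstruction is exact for unit weights, which is what makes the reweighting controllable and predictable.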
EVA: Generating Emotional Behavior of Virtual Agents using Expressive Features of Gait and Gaze
Authors: Tanmay Randhavane, Aniket Bera, Kyra Kapsaskis, R. Sheth, Kurt Gray, Dinesh Manocha
DOI: 10.1145/3343036.3343129 (https://doi.org/10.1145/3343036.3343129)
Published: 2019-07-03, ACM Symposium on Applied Perception 2019
Abstract: We present a novel, real-time algorithm, EVA, for generating virtual agents with various perceived emotions. Our approach is based on using expressive features of gaze and gait to convey emotions corresponding to happy, sad, angry, or neutral. We precompute a data-driven mapping between gaits and their perceived emotions. EVA uses this gait-emotion association at runtime to generate appropriate walking styles in terms of gaits and gaze. Using the EVA algorithm, we can simulate gaits and gazing behaviors of hundreds of virtual agents in real time with known emotional characteristics. We have evaluated the benefits in different multi-agent VR simulation environments. Our studies suggest that the use of expressive features corresponding to gait and gaze can considerably increase the sense of presence in scenarios with multiple virtual agents.
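The precomputed gait-emotion association used at runtime can be sketched as a nearest-neighbor lookup over gait feature vectors. The table entries and feature choices below (stride length, head tilt, speed) are hypothetical placeholders, not EVA's actual data-driven mapping:

```python
import numpy as np

# Hypothetical precomputed association: each entry maps a gait feature
# vector (e.g. stride length, head tilt, walking speed) to a perceived
# emotion label. Values are illustrative, not from the paper's dataset.
GAIT_EMOTION_TABLE = [
    (np.array([1.2, 0.1, 1.4]), "happy"),
    (np.array([0.7, -0.3, 0.8]), "sad"),
    (np.array([1.4, 0.0, 1.8]), "angry"),
    (np.array([1.0, 0.0, 1.2]), "neutral"),
]

def perceived_emotion(gait_features):
    """Runtime lookup sketch: return the emotion whose associated gait
    feature vector is closest (Euclidean distance) to the agent's gait."""
    dists = [np.linalg.norm(gait_features - feats)
             for feats, _ in GAIT_EMOTION_TABLE]
    return GAIT_EMOTION_TABLE[int(np.argmin(dists))][1]
```

At runtime the mapping would also be used in the inverse direction: given a target emotion, pick a gait (and matching gaze behavior) associated with it, which keeps the per-agent cost low enough for hundreds of agents.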