{"title":"AnyLight: an integral illumination device","authors":"Yuichiro Takeuchi, Shunichi Suwa, Kunihiko Nagamine","doi":"10.1145/2929464.2929480","DOIUrl":"https://doi.org/10.1145/2929464.2929480","url":null,"abstract":"We introduce AnyLight, a novel programmable lighting device that can mimic the illumination effects of a broad range of light sources ---both real and imagined---using the principle of integral imaging. The flat, panel-shaped device functions in essence as a type of light field display, relying on custom, 3D printed optics to precisely control light rays emanating from each point on its surface, simulating the existence of arbitrary light sources concealed within the device, e.g., spotlight, candle, skylight, etc. A room illuminated with AnyLight would allow occupants to manipulate ambient lighting with a degree of freedom unreachable using existing programmable lighting setups, where typically color is the only adjustable parameter.","PeriodicalId":314962,"journal":{"name":"ACM SIGGRAPH 2016 Emerging Technologies","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125823012","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Title: Phyxel: realistic display using physical objects with high-speed spatially pixelated lighting
Authors: Takatoshi Yoshida, Yoshihiro Watanabe, M. Ishikawa
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929476 (https://doi.org/10.1145/2929464.2929476)
Abstract: Phyxel is a realistic display that makes a desired physical object appear at spatially pixelated locations. The created image is essentially real, yet can be manipulated as easily as a virtual image. A promising approach to realizing this display can be found in some aspects of the zoetrope and in Fukushima's work [Fukushima et al. 2015]. To realize Phyxel, it is essential to closely coordinate lighting and motion so that the result is perceptually convincing. In the developed system, we move various objects at high speed and control their perceived locations by projecting a computed lighting pattern with a 1000 fps, 8-bit high-speed projector [Watanabe et al. 2015].

Title: Perceptually-based foveated virtual reality
Authors: Anjul Patney, Joohwan Kim, Marco Salvi, Anton Kaplanyan, Chris Wyman, Nir Benty, A. Lefohn, D. Luebke
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929472 (https://doi.org/10.1145/2929464.2929472)
Abstract: Humans have two distinct vision systems: foveal and peripheral vision. Foveal vision is sharp and detailed, while peripheral vision lacks fidelity. The difference between the two systems enables the recently popular foveated rendering approach, which seeks to increase rendering performance by lowering image quality in the periphery. We present a set of perceptually-based methods for improving foveated rendering, running on a prototype virtual reality headset with an integrated eye tracker. Foveated rendering has previously been demonstrated on conventional displays, but it has recently become an especially attractive prospect for virtual reality (VR) and augmented reality (AR) displays, with their large field of view (FOV) and high frame-rate requirements. Investigating prior work on foveated rendering, we find that some previous quality-reduction techniques can create objectionable artifacts such as temporal instability and contrast loss. Our Emerging Technologies installation demonstrates these techniques running live in a head-mounted display, and we will compare them against our new perceptually-based foveated techniques. Our new foveation techniques enable a significant reduction in rendering cost with no discernible loss in visual quality, showing how foveated rendering can meet these display requirements.

Title: HapTONE: haptic instrument for enriched musical play
Authors: D. Ogawa, K. Tanabe, Vibol Yem, Taku Hachisu, H. Kajimoto
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929477 (https://doi.org/10.1145/2929464.2929477)
Abstract: This paper describes a novel music entertainment system that draws on the auditory, tactile, and visual senses. HapTONE presents players with high-fidelity vibrotactile sensations, not only after a key is pressed but also during the pressing operation itself. We developed a keyboard-type instrument composed of key units, each containing a vibrator and a distance sensor. The instrument can reproduce the touch sensations of keyboard, stringed, wind, percussion, and non-musical instruments. We describe three applications of HapTONE: 1) accurate replication of percussion instruments; 2) playing of pseudo-stringed instruments; and 3) vibration synchronized with animation.

Title: Computational focus-tunable near-eye displays
Authors: Robert Konrad, Nitish Padmanaban, Emily B. Cooper, Gordon Wetzstein
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929470 (https://doi.org/10.1145/2929464.2929470)
Abstract: Immersive virtual and augmented reality systems (VR/AR) are entering the consumer market and have the potential to profoundly impact our society. Applications of these systems range from communication, entertainment, education, collaborative work, simulation, and training to telesurgery, phobia treatment, and basic vision research. In every immersive experience, the primary interface between the user and the digital world is the near-eye display. Thus, developing near-eye display systems that provide a high-quality user experience is of the utmost importance. Many characteristics of near-eye displays that define the quality of an experience, such as resolution, refresh rate, contrast, and field of view, have been significantly improved in recent years. However, a significant source of visual discomfort prevails: the vergence-accommodation conflict (VAC). This visual conflict results from the fact that vergence cues, but not focus cues, are simulated in near-eye display systems. Indeed, natural focus cues are not supported by any existing near-eye display. Afforded by focus-tunable optics, we explore unprecedented display modes that tackle this issue in multiple ways, with the goal of increasing visual comfort and providing more realistic visual experiences.

Title: Enjoy 360° vision with the FlyVIZ
Authors: G. Andrade-Barroso, Florian Nouviale, Jérôme Ardouin, É. Marchand, M. Marchal, A. Lécuyer
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929471 (https://doi.org/10.1145/2929464.2929471)
Abstract: FlyVIZ is a novel wearable display device that extends the human field of view up to 360°. With FlyVIZ, users can enjoy artificial omnidirectional vision and see "with eyes behind their back"! The latest version of this approach, called "FlyVIZ_v2", which we propose for the SIGGRAPH audience, is a novel, compact, and lightweight prototype based on consumer-grade, off-the-shelf components. It assembles a smartphone, a panoramic mirror, and an Oculus head-mounted display to process a live video stream of the user's surroundings and provide a real-time omnidirectional image. We invite SIGGRAPH attendees to try this unique sensory experience and this new kind of augmented vision.

{"title":"Graphical manipulation of human's walking direction with visual illusion","authors":"Akira Ishii, Ippei Suzuki, Shinji Sakamoto, Keita Kanai, Kazuki Takazawa, Hiraku Doi, Yoichi Ochiai","doi":"10.1145/2929464.2967926","DOIUrl":"https://doi.org/10.1145/2929464.2967926","url":null,"abstract":"Conventional research on pedestrian navigation systems has explored the possibilities of presenting information to users both visually and aurally. Existing navigation systems require users to recognize information, and then to follow directions as separate, conscious processes, which inevitably require attention to the system. This study proposes a novel method that enables pedestrians to be guided without conscious interaction with a navigational system.","PeriodicalId":314962,"journal":{"name":"ACM SIGGRAPH 2016 Emerging Technologies","volume":"49 4","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132905047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Title: HapticWave: directional surface vibrations using wave-field synthesis
Authors: Ravish Mehra, Christoph Hohnerlein, David Perek, E. Gatti, R. DeSalvo, Sean Keller
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929469 (https://doi.org/10.1145/2929464.2929469)
Abstract: Recent advances in haptic technology have focused on delivering haptic vibrations to the user's hand, either through touched physical surfaces (or objects) with embedded haptic devices or through worn (or held) haptic feedback devices such as gloves or controllers. In most of these devices, the sensation of touch is controlled by modulating either the intensity or the frequency of the haptic actuation. One missing piece is a sense of direction. Our hands perceive haptic actuation over an area rather than at a single point; by using the subtle phase and amplitude differences in the vibrations felt over an extended surface, humans can detect the direction of haptic vibrations, in addition to their intensity and frequency. This added dimension of directional vibration has not received significant attention in the haptic literature.

Title: HALUX: projection-based interactive skin for digital sports
Authors: Haruya Uematsu, D. Ogawa, Ryuta Okazaki, Taku Hachisu, H. Kajimoto
Venue: ACM SIGGRAPH 2016 Emerging Technologies (July 24, 2016)
DOI: 10.1145/2929464.2929479 (https://doi.org/10.1145/2929464.2929479)
Abstract: Entertainment content employing users' whole-body actions is becoming popular, along with the prevalence of low-cost whole-body motion capture systems. When adding a haptic modality in this context, latency becomes a critical issue because it leads to a spatial disparity between the assumed contact location and the tactile stimulation position. To cope with this issue, we propose projecting the drive signal in advance, eliminating the latency derived from communication. Rather than explicitly controlling each vibrator, we project a position-dependent vibration-strength distribution image. As a result, the system becomes highly scalable, enabling the simultaneous drive of hundreds of units attached to the body.
