{"title":"HOLO-DOODLE: an adaptation and expansion of collaborative holojam virtual reality","authors":"T. Masson, Daffy, K. Perlin","doi":"10.1145/3089269.3089270","DOIUrl":"https://doi.org/10.1145/3089269.3089270","url":null,"abstract":"Social media has exploded so rapidly that we have to run to keep up. Whilst babies are kept occupied by iPads, kids get lost in smartphones, and teens lock themselves into their own isolating VR worlds, we see the need for change. We're losing the art of interaction and conversation. In this multi-person simultaneous collaborative VR installation, we demonstrate that social interaction is the key component to making VR a far more positive and lasting experience. HOLO-DOODLE is an adaptation of Holojam, created by Professor Ken Perlin and his next-level team at NYU and re-jigged by VR gurus Superbright, in which sensors are strapped to the extremities of a group of people wearing Samsung Gear VR headsets; given virtual paint brushes, they become acquainted in an entirely new way through the medium of art and conversation. As we fully explore the parameters of our being, our art, our gender and our humanity, we face ever-growing walls of oppression. VR is still relatively undefined. It disturbingly makes war games ever more real, and as a format it extends far beyond the next progression of film. We need to look at the technology differently: use it to bring people out of their shells, to dance like no one's watching, to play out a fantasy. It's a movement for good, and we can change the emphasis of VR to something far more positive. Allow folk to interact without the shackles of how individuals can be pre-judged in the real world.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117041624","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A brain-computer interface for extended reality interfaces","authors":"Jay J. Jantz, Adam Molnar, Ramses Alcaide","doi":"10.1145/3089269.3089290","DOIUrl":"https://doi.org/10.1145/3089269.3089290","url":null,"abstract":"Extended reality (XR) technologies, such as augmented reality (AR) and virtual reality (VR), remain limited in their interaction modalities. Prevailing interaction methods such as hand gestures and voice recognition prove awkward in XR environments, even when performing common tasks (e.g., object selection, menu navigation, and others). In contrast, an ideal interaction method would robustly and naturally translate a user's intention into both 2D and 3D environmental controls. A direct brain-computer interface (BCI) system is ideally situated to accomplish this. Neurable's technology provides a solution to maximize XR's potential, affording users real-time mental selection via dry electroencephalography (EEG).","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125959245","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hallelujah: the world's first lytro VR experience","authors":"Tim Milliron, Chrissy Szczupak, Orin Green","doi":"10.1145/3089269.3089283","DOIUrl":"https://doi.org/10.1145/3089269.3089283","url":null,"abstract":"Lytro Immerge is the world's first professional Light Field solution for VR, providing true presence for live action VR through six degrees of freedom (6DoF) playback. Lytro's 6DoF playback allows the viewer to move around freely in the viewing experience, complete with parallax, view-dependent lighting, and perfect stereo in all directions no matter the orientation of the viewer's head. A Within Original, Hallelujah reimagines Leonard Cohen's most well-known song with an original a cappella composition. It is the world's first VR music experience to provide an uncompromised sense of presence with six degrees of freedom (6DoF) using Lytro Immerge. Hallelujah was created by Zach Richter, Bobby Halvorson and Eames Kolar. In Hallelujah, viewers feel a sense of intimacy and human connection with composer/performer Bobby Halvorson. For the first time ever in virtual reality, there is real depth, character and dimension to Bobby and the live action environment due to Lytro's Light Field technology. Viewers can make eye contact with him, step forward to look closer, and shift side-to-side to see around him, all with perfect stereo, perspective, and parallax. The clean and elegant environment of the piece allows this realism to set in, deepening the connection between the viewer and Bobby as they inhabit the space together. As the experience progresses, the composition builds and envelops the viewer in music from 360 degrees. \"Hallelujah is such a powerful song; coupled with the immersive technology from Lytro, we believe that people will feel connected to the experience in such an emotional way,\" said Zach Richter. \"Stripping the components of the piece to the essentials and using this technology creates an incredible sense of presence. It becomes real. Something impossible, like having five different versions of Bobby performing all around you and singing to you, feels normal. With virtual reality we can think about music in an entirely new way.\" Hallelujah was created using the Lytro Immerge Light Field camera, processing pipeline and playback technology. The end-to-end solution: • Represents the first live action capture system to enable six degrees of freedom playback • Provides the highest quality and highest resolution • Allows for true presence: everything in the headset reacts as it should in the real world","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"54 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121962987","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Wakeboarding: an exertion game in virtual reality","authors":"Yuchen Hong, Chen-Yuan Hsieh, Keng-Ta Yang, Liwei Chan","doi":"10.1145/3089269.3089271","DOIUrl":"https://doi.org/10.1145/3089269.3089271","url":null,"abstract":"Recently, there has been increasing attention given to haptic-enhanced virtual reality. Blending visual immersion with haptic feedback suggests improved engagement [Rheiner 2014]. This project starts with the integration of exertion design and bodily haptic interaction, since the increased engagement can lead to increased exertion and enjoyment in exertion games.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117115725","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Dream makers","authors":"Dimosthenis Gkantzos, Jan Fiess, Lukas Gotkowski, Aleksandra Todorovic","doi":"10.1145/3089269.3089278","DOIUrl":"https://doi.org/10.1145/3089269.3089278","url":null,"abstract":"Dream Makers (Figure 1) is a Virtual Reality (VR) cooperative puzzle game for two players, taking place in the universe of dreams. Using a Head Mounted Display (HMD) and a tablet, the two players have to keep communication alive to combine the right ingredients and reach the desired dream. The game demands physical movement, creativity, decision-making and puzzle solving, and is surrounded by sound effects, lighting elements and vibration as feedback for all of the players' actions, whether right or wrong. Our goal was to make the social factor, which is non-existent most of the time, a key element in a VR experience.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116129823","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AR mail from Harbin","authors":"T. Nagakura, Woongki Sung","doi":"10.1145/3089269.3089274","DOIUrl":"https://doi.org/10.1145/3089269.3089274","url":null,"abstract":"AR Mail from Harbin is a small augmented reality (AR) application that works with a set of postcards. Each postcard is printed with a portion of the plan of St. Sophia, the main church in the center of the city of Harbin. A user can assemble a 3D model of the building by laying out the postcards in the proper composition. By combining a photogrammetric capture of the church, AR technology, and traditional paper media, this tool enhances visitors' experience at the historical location, increases their understanding of its spatial design, and promotes social interaction between the visitors on site and their friends in remote locations, all in a playful setting.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127073947","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Meet Mike: epic avatars","authors":"Mike Seymour, Chris Evans, Kim Libreri","doi":"10.1145/3089269.3089276","DOIUrl":"https://doi.org/10.1145/3089269.3089276","url":null,"abstract":"Meet Mike uses the latest techniques in advanced motion capture to drive complex facial rigs to enable detailed interaction in VR. This allows participants to meet, talk in VR and experience new levels of photorealistic interaction. The installation uses new advances in real time rigs, skin shaders, facial capture, deep learning and real-time rendering in UE4.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131791207","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Remote collaboration in AR and VR using virtual replicas","authors":"Carmine Elvezio, Mengu Sukan, Ohan Oda, Steven K. Feiner, B. Tversky","doi":"10.1145/3089269.3089281","DOIUrl":"https://doi.org/10.1145/3089269.3089281","url":null,"abstract":"In many complex tasks, a remote subject-matter expert may need to assist a local user, to guide their actions on objects in the local user's environment. However, effective spatial referencing and action demonstration in a remote physical environment can be challenging. We demonstrate an approach that uses Virtual Reality (VR) or Augmented Reality (AR) for the remote expert, and AR for the local user, each wearing a stereo head-worn display (HWD). Our approach allows the remote expert to create and manipulate virtual replicas of physical objects in the local environment to refer to parts of those physical objects and to indicate actions on them. This can be especially useful for parts that are occluded or difficult to access. The remote expert can demonstrate actions in 3D by manipulating virtual replicas, supported by constraints and annotations, and point in 3D to portions of virtual replicas to annotate them.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130293945","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Magic bench: a multi-user & multi-sensory AR/MR platform","authors":"K. McIntosh, John Mars, James Krahe, J. McCann, Alexander Rivera, Jake Marsico, A. Israr, Shawn Lawson, Moshe Mahler","doi":"10.1145/3089269.3089272","DOIUrl":"https://doi.org/10.1145/3089269.3089272","url":null,"abstract":"Mixed Reality (MR) and Augmented Reality (AR) create exciting opportunities to engage users in immersive experiences, resulting in natural human-computer interaction. Many MR interactions are generated around a first-person Point of View (POV). In these cases, the user's attention is directed to the environment, which is digitally displayed either through a head-mounted display or a handheld computing device. One drawback of such conventional AR/MR platforms is that the experience is user-specific. Moreover, these platforms require the user to wear and/or hold an expensive device, which can be cumbersome and alter interaction techniques. We create a solution for multi-user interactions in AR/MR, where a group can share the same augmented environment with any computer generated (CG) asset and interact in a shared story sequence through a third-person POV. Our approach is to instrument the environment, leaving the user unburdened of any equipment and creating a seamless walk-up-and-play experience. We demonstrate this technology in a series of vignettes featuring humanoid animals. Participants can not only see and hear these characters; they can also feel them on the bench through haptic feedback. Many of the characters also interact with users directly, either through speech or touch. In one vignette an elephant hands a participant a glowing orb. This demonstrates HCI in its simplest form: a person walks up to a computer, and the computer hands the person an object.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"166 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127484594","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"FLOCK: a location-based, multi-user VR experience","authors":"David Lobser, K. Perlin, Lily Fang, Christopher Romero","doi":"10.1145/3089269.3089279","DOIUrl":"https://doi.org/10.1145/3089269.3089279","url":null,"abstract":"Flock is a shared, immersive, co-located experience made for groups of up to thirty participants. It is a gamified sandbox, an interactive music video and a ritualized LARP (Live Action Role Playing game), as well as one of very few examples of work in the new and unique medium of location-based, multi-user VR. This paper describes the aspirations, challenges and lessons learned while developing a project in this new medium.","PeriodicalId":426114,"journal":{"name":"ACM SIGGRAPH 2017 VR Village","volume":"107 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2017-07-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124845353","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}