{"title":"Transdisciplinary approach to Augmented reality Digital heritage Mobile applications","authors":"Irina Tsokova, Adam Stephenson","doi":"10.1109/ISMAR-Adjunct54149.2021.00017","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00017","url":null,"abstract":"This paper follows the development and collaboration behind the creation of an Augmented Reality (AR) mobile application. Sponsored by the Nupur arts organization and commissioned by Arts Council England, we were asked to create a digital heritage experience telling the story of Gujarati migrants from East Africa and India during the 1970s and 1980s in Leicester, United Kingdom. This paper examines how working in a transdisciplinary team aided the successful creation of 3D scanned models, which were later imported into an AR environment. Finally, the paper concludes with an analysis of the cultural implications of mobile applications such as this one and the social and communal impact they might have.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125877865","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Analysis and Validation for Kinematic and Physiological Data of VR Training System","authors":"Shuwei Chen, Ben Hu, Yang Gao, Yang Liu, Zhiping Liao, Jianhua Li, Aimin Hao","doi":"10.1109/ISMAR-Adjunct54149.2021.00040","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00040","url":null,"abstract":"Virtual reality applications can provide a more immersive environment that improves users’ enthusiasm to participate. For VR-based limb motor training applications, the widespread use of VR techniques still faces many challenges. On the one hand, it is not easy to evaluate the effectiveness and accuracy of VR-based programs. On the other hand, monitoring the users’ physical and mental burden during the training process is an essential but difficult task. To this end, we propose a simple and economical VR-based application for limb motor training. Kinematic data are used to monitor the user’s movements quantitatively. We also collect physiological data, including heart rate variability (HRV) and electroencephalogram (EEG) data. HRV data are used to assess physical fatigue in real time, and EEG data can be used to detect mental fatigue in the future. Based on this application, we have conducted many experiments and user studies to verify the accuracy of kinematic data monitoring and the feasibility of fatigue detection. The results demonstrate that VR-based solutions for limb motor training offer good kinematic data measurement precision. Meanwhile, the physiological data showed that VR-based rehabilitation does not cause excessive physical fatigue to participants.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124711117","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cuboid-Shaped Space Recognition from Noisy Point Cloud for Indoor AR Workspace","authors":"Ki-Sik Kim, Jong-Seung Park","doi":"10.1109/ISMAR-Adjunct54149.2021.00120","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00120","url":null,"abstract":"This paper proposes a geometric shape recognition method for a cuboid-shaped indoor space from a point cloud. We first acquire a point cloud using Visual SLAM for a spherical video. Then, we obtain a geometric model for the indoor space by finding a best-fit cuboid from the point cloud and adjusting it to the real-world environment. The geometric model for the indoor space is updated using the recognized cuboid data. We implemented the proposed method and built a prototype application, which is a simple FPS AR game. Our experiments show that the proposed method provides accurate geometric estimation even when there are a lot of noisy map points in the point cloud.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"17 1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125065498","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Impact of Gaze Cues in Mixed Reality Collaborations","authors":"Allison Jing","doi":"10.1109/ISMAR-Adjunct54149.2021.00111","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00111","url":null,"abstract":"Gaze is one of the most important communication cues in performing physical tasks in both face-to-face and remote collaboration. Dynamic gaze information can indicate the user’s intention, focus, and current attention while visualising this information can often compensate for other communication channels that are not always readily available. Previous studies have shown that sharing and understanding another person’s gaze cues can benefit mutual awareness and task coordination in traditional 2D displays. However, researchers have not fully explored the impact of the virtual representations of gaze cues using Mixed Reality technologies. In this doctoral consortium presentation, I will present eyemR-Vis, a 360 panoramic Mixed Reality (MR) remote collaboration system that shares gaze behavioural visualisations between a local worker and a remote collaborator. In the paper I discuss the PhD research motivation, background material, recently published study results, and plans for future work.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"29 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129406393","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A VR Application for the Virtual Fitting of Fashion Garments on Avatars","authors":"Orestis Sarakatsanos, E. Chatzilari, S. Nikolopoulos, Y. Kompatsiaris, Dongjoe Shin, David Gavilan, Jim Downing","doi":"10.1109/ISMAR-Adjunct54149.2021.00018","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00018","url":null,"abstract":"In this paper, we present an interactive Virtual Reality (VR) application for fashion designers that aims to offer an immersive environment for accurately visualizing and testing garments in the design process. The VR environment simulates a designer’s fitting room, offering close-up inspection of an avatar wearing a garment across a plethora of movements. This, in turn, offers the designer a better perspective of how the final product will look, fit, and behave. In implementing the proposed application, we first review the options in game engines and libraries for supporting realistic avatar-garment simulation, and propose a methodology for transforming production-designed garments into a format that can be used in game-engine cloth simulations.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129558630","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Mixed-Reality System to Promote Child Engagement in Remote Intergenerational Storytelling","authors":"Jennifer Healey, Duotun Wang, Curtis Wigington, Tong Sun, Huaishu Peng","doi":"10.1109/ISMAR-Adjunct54149.2021.00063","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00063","url":null,"abstract":"We present a mixed reality (MR) storytelling system designed specifically for multi-generational collaboration with child engagement as a key focus. Our \"Let’s Make a Story\" system comprises a two-sided experience that brings together a remote adult and child to tell a story collaboratively. The child has a mixed reality phone-based application with an augmented manipulative that controls the story’s main character. The remote adult participates through a web-based interface. The adult reads the story to the child and helps the child play the story game by providing them with items they need to clear the scenes. In this paper, we detail the implementation of our system and the results of a user study. Eight remote adult-child pairs experienced both the MR and a traditional paper-based storytelling system. To measure engagement, we used questionnaire analysis, engagement time with the story activity, and the word count of the child’s description of how the story should end. We found that children uniformly preferred the MR system, spent more time engaged with the MR system, and used more words to describe how the story should end, incorporating details from the game.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130545113","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assessing Telepresence, Social Presence and Stress Response in a Virtual Reality Store","authors":"Yinshu Zhao, N. Baghaei, Alexander Schnack, Lehan Stemmet","doi":"10.1109/ISMAR-Adjunct54149.2021.00020","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00020","url":null,"abstract":"The development of immersive Virtual Reality (VR) has provided users around the globe with a highly realistic virtual world experience. Since its first use, extensive research has been conducted in an attempt to understand how human behaviour in virtual environments compares to the real world. Studies have shown that people exhibit similar behaviours and reactions in a number of scenarios, including virtual shopping, making it a promising tool for researchers to study in-store shopper behaviour. This paper outlines ideas on how store atmospherics can affect the user experience, stress levels, and behaviour in a virtual store environment developed using the Unreal Game Engine. The presence or absence of avatars can be investigated as an important aspect of store atmospherics and a potential antecedent of perceived presence in a simulated retail environment. These insights will be useful for retailers, as they can guide the development and improvement of virtual simulated shopping experiences, using elements of telepresence and social presence, to enhance the consumer shopping experience and their own retail strategy.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132900933","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Reproduction of Environment Reflection using Extrapolation of Front Camera Images in Mobile AR","authors":"Shun Odajima, T. Komuro","doi":"10.1109/ISMAR-Adjunct54149.2021.00068","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00068","url":null,"abstract":"In this paper, we propose a method to reproduce the reflection of a real scene on a virtual object using only the images captured by a camera attached to the front of a mobile device. Since it is not possible to acquire the entire scene using only the front camera, the area surrounding the front camera image is extrapolated to obtain a sufficient scene for reflection. Image transformation using deep neural networks is used for extrapolation, and high-quality, stable extrapolation is achieved by reusing the extrapolation results from the previous frame. In an experiment that evaluated the quality of extrapolation and the reproduction of reflection, we confirmed that both the extrapolated images and the reproduced reflections looked natural. We also conducted an experiment to evaluate users' impressions of the reflection. The results showed that the proposed method was effective, in terms of naturalness of reflection and material perception, for participants who were familiar with AR.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126302907","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Co-Drive: the experience of a shared car trip between a driver and a remote passenger","authors":"Laura Boffi, G. Mincolelli, Simone Bertucci, Lorenzo Gammarota, Fabio Pes, Marco Garofoli","doi":"10.1109/ISMAR-Adjunct54149.2021.00118","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00118","url":null,"abstract":"Co-Drive is a service concept that allows social virtual travelling by car between a driver of a vehicle and a remote passenger connected via virtual reality from home. The Co-Drive concept enables novel social interactions between a driver and a remote passenger who are unknown to each other and it aims to foster new social encounters, for example intergenerational ones between elderly remote passengers (with reduced mobility and travel possibilities) and younger drivers. At ISMAR 2021, Co-Drive will be demonstrated as a way to foster casual and unfocused encounters between unknown conference attendees.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"30 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122210243","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Perceived Transparency in Optical See-Through Augmented Reality","authors":"Lili Zhang, M. Murdoch","doi":"10.1109/ISMAR-Adjunct54149.2021.00033","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct54149.2021.00033","url":null,"abstract":"Optical see-through (OST) displays overlay rendered images onto the real-world background, creating augmented reality (AR). However, the blending between the rendering and the real-world background introduces perceived transparency. The increased luminance of the rendering decreases the background contrast and reduces the perceived transparency, which is not incorporated in existing color appearance models or display color management pipelines. We studied perceived transparency in AR, focusing on the interaction between the rendering and the patterned background across various luminances, contrasts, and waveforms. In addition to AR contrast, we also examined simulated contrast modulation by changing the luminance amplitude. Two psychophysical experiments were conducted to quantify the perceived transparency. The first experiment measured a perceived transparency scale using direct scaling, and the second evaluated the transparency equivalency between the two methods of contrast modulation. The results showed that the two methods evoke different transparency perceptions. The background contrast affects the perceived AR transparency significantly, while the background luminance and waveform do not. Based on the results of the first experiment, we proposed a model that predicts perceived transparency from AR luminance and background contrast. The model was verified with the second experiment and showed good predictive performance. Our model presents a new perceptual dimension in OST AR and could be incorporated into color management pipelines to improve AR image quality.","PeriodicalId":244088,"journal":{"name":"2021 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"130 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123073076","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}