{"title":"Investigating Three-dimensional Directional Guidance with Nonvisual Feedback for Target Pointing Task","authors":"SeungA Chung, Kyungyeon Lee, U. Oh","doi":"10.1109/ISMAR-Adjunct51615.2020.00061","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00061","url":null,"abstract":"While directional guidance is essential for spatial navigation, little has been studied about providing nonvisual cues in 3D space for individuals who are blind or have limited visual acuity. To understand the effects of different nonvisual feedback for 3D directional guidance, we conducted a user study with 12 blindfolded participants. They were asked to search for a virtual target in a 3D space with a laser pointer as quickly as possible under 6 different feedback designs varying the feedback mode (beeping vs. haptic vs. beeping+haptic) and the presence of a stereo sound. Our findings show that beeping sound feedback, with or without haptic feedback, outperforms the mode where only haptic feedback is provided. We also found that stereo sound feedback generated from a target significantly improves both the task completion time and travel distance. Our work can help people who are blind or have limited visual acuity understand directional guidance in a 3D space.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"102 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132623155","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Real-Time Gait Reconstruction For Virtual Reality Using a Single Sensor","authors":"Tobias Feigl, Lisa Gruner, Christopher Mutschler, Daniel Roth","doi":"10.1109/ISMAR-Adjunct51615.2020.00037","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00037","url":null,"abstract":"Embodying users through avatars based on motion tracking and reconstruction is an ongoing challenge for VR application developers. High-quality VR systems use full-body tracking or inverse kinematics to reconstruct the motion of the lower extremities and control the avatar animation. Mobile systems are limited to the motion sensing of head-mounted displays (HMDs) and typically cannot offer this. We propose an approach to reconstruct gait motions from a single head-mounted accelerometer. We train our models to map head motions to corresponding ground-truth gait phases. To reconstruct leg motion, the models predict gait phases to trigger equivalent synthetic animations. We designed four models: a threshold-based, a correlation-based, a Support Vector Machine (SVM)-based, and a bidirectional long short-term memory (BLSTM)-based model. Our experiments show that, while the BLSTM approach is the most accurate, only the correlation approach runs on a mobile VR system in real time with sufficient accuracy. Our user study with 21 test subjects examined the effects of our approach on simulator sickness and showed significantly fewer negative effects on disorientation.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"186 ","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114089401","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Lower Limb Balance Rehabilitation of Post-stroke Patients Using an Evaluating and Training Combined Augmented Reality System","authors":"Shuwei Chen, Ben Hu, Yang Gao, Zhiping Liao, Jianhua Li, Aimin Hao","doi":"10.1109/ISMAR-Adjunct51615.2020.00064","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00064","url":null,"abstract":"Augmented/virtual reality applications can provide immersive and interactive virtual environments for motor rehabilitation using the collaborative stimulation of multiple sensory channels such as sight, hearing, and movement, and can enhance the rehabilitation effect through repetition, feedback, and encouragement. In this paper, we propose an integrated evaluating and training application for the rehabilitation of patients with lower limb balance disorders. The AR-based evaluation module visualizes the limits of patients’ lower-limb balance abilities and provides quantitative data to their therapists, so that rehabilitation therapists can customize personalized VR training games accordingly.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129969498","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Locomotive and Cognitive Trade-Offs for Target-based Travel","authors":"Chengyuan Lai, Afham Ahmed Aiyaz, Ryan P. McMahan","doi":"10.1109/ISMAR-Adjunct51615.2020.00034","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00034","url":null,"abstract":"Target-based travel has become a common travel metaphor for virtual reality (VR) applications. Three of the most common target-based travel techniques include Point-and-Instant-Teleport (Teleport), Point-and-Walk-Motion (Motion), and Automatic-Walk-Motion (Automatic). We present a study that employed a dual-task methodology to investigate the user performance characteristics and cognitive loads of the three target-based travel techniques, in addition to several subjective measures. Our results indicate that the Teleport technique afforded the best user travel performance, while the Automatic technique imposed the lowest cognitive load.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126259308","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"MiXR: A Hybrid AR Sheet Music Interface for Live Performance","authors":"Shalva Kohen, Carmine Elvezio, Steven K. Feiner","doi":"10.1109/ISMAR-Adjunct51615.2020.00035","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00035","url":null,"abstract":"Musicians face a number of issues when performing live, including organizing and annotating sheet music. This can be an unwieldy process, as musicians need to simultaneously read and manipulate sheet music and interact with the conductor and other musicians. Augmented Reality can provide a way to ease some of the more cumbersome aspects of live performance and practice. We present MiXR, a novel interactive system that combines an AR headset, a smartphone, and a tablet to allow performers to intuitively and efficiently manage and annotate virtual sheet music in their physical environment. We discuss our underlying motivation, the interaction techniques supported, and the system architecture.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124471232","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparing Methods for Mapping Facial Expressions to Enhance Immersive Collaboration with Signs of Emotion","authors":"Natalie Hube, O. Lenz, Lars Engeln, Rainer Groh, M. Sedlmair","doi":"10.1109/ISMAR-Adjunct51615.2020.00023","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00023","url":null,"abstract":"We present a user study comparing a pre-evaluated mapping approach with a state-of-the-art direct mapping method of facial expressions for emotion judgment in an immersive setting. At its heart, the pre-evaluated approach leverages semiotics, a theory used in linguistics. In doing so, we want to compare pre-evaluation with an approach that seeks to directly map real facial expressions onto their virtual counterparts. To evaluate both approaches, we conducted a controlled lab study with 22 participants. The results show that users are significantly more accurate in judging virtual facial expressions with pre-evaluated mapping. Additionally, participants were slightly more confident when deciding on a presented emotion. We could not find any differences regarding potential Uncanny Valley effects. However, the pre-evaluated mapping shows potential to be more convenient in a conversational scenario.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"191 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132630202","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Machine Intelligence Matters: Rethink Human-Robot Collaboration Based on Symmetrical Reality","authors":"Zhenliang Zhang, Xuejiao Wang","doi":"10.1109/ISMAR-Adjunct51615.2020.00066","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00066","url":null,"abstract":"Human-robot collaboration could be valuable in some challenging tasks. Previous research has considered only human-centered systems, but symmetrical reality (SR) systems differ in many respects because they contain two perceptual centers. In this paper, we introduce the contents of symmetrical reality-based human-robot collaboration and interpret human-robot collaboration from the perspective of equivalent interaction. By analyzing task definition in symmetrical reality, we present the special features of human-robot collaboration. Furthermore, symmetrical reality can produce a remarkable effect in many fields; we list some typical applications, such as service robots, remote training, interactive exhibitions, digital assistants, companion robots, and the immersive entertainment community. The current situation and future development of this framework are also analyzed to provide guidance for researchers.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"4 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126884311","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A User Study on AR-assisted Industrial Assembly","authors":"Florian Schuster, Uwe Sponholz, Bastian Engelmann, Jan Schmitt","doi":"10.1109/ISMAR-Adjunct51615.2020.00047","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00047","url":null,"abstract":"Modern assistance systems, e.g. Augmented Reality (AR), have found their way into industrial assembly scenarios. Besides the technical realization of AR assistance in the assembly scene, the worker has to accept the new technology. Only the combination of user acceptance and technical user-interface design leads to an optimized overall system. Hence, this contribution gives a brief literature overview and analysis of AR acceptance and acceptance modeling. Then, a proprietary model for acceptance measurement is developed, which includes and synthesizes previous models (TAM and UTAUT) and simplifies them considerably for the purpose of industrial assembly. Subsequently, a laboratory experiment is set up in the FHWS c-Factory, a smart, IoT-based production environment. A survey and an assembly cycle time measurement are conducted to collect data to characterize AR assistance. The study participants assemble a toy truck once without and once with AR support. The evaluation shows that the mean assembly time decreases. The results also show that AR is accepted by the participants as supporting their work.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115494047","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards an AR game for walking rehabilitation: Preliminary study of the impact of augmented feedback modalities on walking speed","authors":"A. Guinet, Guillaume Bouyer, S. Otmane, E. Desailly","doi":"10.1109/ISMAR-Adjunct51615.2020.00075","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00075","url":null,"abstract":"Designing a serious game for walking rehabilitation requires compliance with the theory of motor learning. Motivation, repetition, variability, and feedback are key elements in improving and relearning a walking pattern. As a preamble to the development of an AR rehabilitation game, and in order to choose the most effective feedback to provide to the patient, this article presents a preliminary study on the impact of presentation modalities on walking speed. We investigate which visual concurrent feedback modalities allow patients to reach and maintain a target speed (maximum or intermediate). Our first results on children with motor disabilities (n=10) show that some modalities improved walking performance and helped patients better control their walking speed. In particular, a combination of targets anchored in the real world with a time indication seems to be effective in maintaining maximum walking speed, while simple moving objects could be used to control speed.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"91 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129601387","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using Augmented Reality to Explore Museum Artifacts","authors":"Rayed A. Alakhtar","doi":"10.1109/ISMAR-Adjunct51615.2020.00083","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00083","url":null,"abstract":"Many technologies have been used recently to enhance the tourism experience, and Augmented Reality (AR) has proven to be one of the more promising. This research study aims to use AR to enhance the tourism experience at heritage sites, and in particular to show that AR can provide a richer understanding of how historical artifacts were used. This research adopts an exploratory approach comprising: an exploratory survey; an exploratory interview study in Saudi Arabia; the design of a User Interface (UI); and an evaluation of user experience. The expected outcomes of this research are: firstly, conclusions drawn from the interview data; and secondly, a User Interface application for the General Entertainment Authority of Saudi Arabia.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127645883","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}