{"title":"Evaluation of Stroke Assessment in Simulated Virtual World","authors":"M. Mancosu","doi":"10.1109/ISMAR-Adjunct51615.2020.00087","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00087","url":null,"abstract":"In this paper we present our application, a simulator for assessing stroke cases in Virtual Reality. Our project aims to increase the efficiency of training by using serious games and virtual environments instead of traditional teaching methods. This paper describes the results of a survey performed on our serious game application comparing two different simulation examples of a stroke case. The participants' perception of the animation and the environment is discussed in conjunction with further development of the tool.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127978628","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Developing an eXtended Reality platform for Immersive and Interactive Experiences for Cultural Heritage: Serralves Museum and Coa Archeologic Park","authors":"Manuel Silva, Luís Teixeira","doi":"10.1109/ISMAR-Adjunct51615.2020.00084","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00084","url":null,"abstract":"Digital Heritage and Digital Humanities focus on distinct concerns: the preservation, education, and research of tangible and intangible Cultural Heritage (CH) objects versus the application of digital technologies to support research in the humanities. Both allow scholars to go beyond textual sources and integrate digital tools into humanistic study. This project aims at supporting a new way of experiencing CH in the Serralves Museum and Coa Archeologic Park through a more involving and culturally qualified user experience. The main goal is to understand the potential of eXtended Reality within CH while also proposing a digital experience platform: an authoring tool based on an engine with core experience functions that can be applied to develop multiple experiences for CH. This platform will contribute new approaches, technologies, and tools for creating, processing, and delivering immersive and interactive content for engaging and meaningful experiences in these specific CH environments.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128766349","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluation of Different Visualization Techniques for Perception-Based Alignment in Medical AR","authors":"Marc Fischer, Christoph Leuze, Stephanie L. Perkins, J. Rosenberg, B. Daniel, A. Martin-Gomez","doi":"10.1109/ISMAR-Adjunct51615.2020.00027","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00027","url":null,"abstract":"Many Augmented Reality (AR) applications require the alignment of virtual objects to the real world; this is particularly important in medical AR scenarios where medical imaging information may be displayed directly on a patient and used to identify the exact locations of specific anatomical structures within the body. For optical see-through AR, alignment accuracy depends on both the optical parameters of the AR display and the visualization parameters of the virtual model. In this paper, we explore how different static visualization techniques influence users' ability to perform perception-based alignment in AR for breast reconstruction surgery, where surgeons must accurately identify the locations of several perforator blood vessels while planning the procedure. We conducted a pilot study in which four subjects used four different visualization techniques with varying degrees of opaqueness, brightness, and outline contrast to align virtual replicas of the relevant anatomy to their 3D-printed counterparts. We collected quantitative scores on spatial alignment accuracy using an external tracking system and qualitative scores on user preference and perceived performance. Results indicate that the largest source of alignment error was along the depth dimension, with users consistently overestimating depth when aligning the virtual renderings. The majority of subjects preferred visualization techniques rendered with lower levels of opaqueness and brightness and higher outline contrast, which were also found to support more accurate alignment.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"65 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117340064","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Usability Considerations of Hand Held Augmented Reality Wiring Tutors","authors":"B. Herbert, W. Hoff, M. Billinghurst","doi":"10.1109/ISMAR-Adjunct51615.2020.00078","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00078","url":null,"abstract":"Electrical repair tasks across domains use a common set of skills that combine problem solving, fine motor skills, and spatial skills. Augmented Reality (AR) helps develop these skills by overlaying virtual objects on the real world. We therefore designed a handheld AR-based wiring tutor that incorporates Constraint-Based Modelling (CBM) paradigms to detect learner errors in an electrical wiring task. We compared the performance and usability of our prototype with a state-of-the-art handheld AR-based training system with a consistent user interface (UI) design, which lacks CBM approaches. Although the CBM condition had significantly lower usability scores than the state-of-the-art system, participants using the CBM approach achieved higher practical scores. We discuss reasons for the usability differences, including the potential for positive perceptions of the system to be distorted by the critical feedback needed to regulate learning in the electrical wiring domain. Next steps include using the same system to evaluate theoretical and practical learning outcomes.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"80 2","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133038187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"An Image-Based Method for Measuring Strabismus in Virtual Reality","authors":"Wolfgang A. Mehringer, M. Wirth, Stefan Gradl, Luis S. Durner, M. Ring, A. Laudanski, B. Eskofier, G. Michelson","doi":"10.1109/ISMAR-Adjunct51615.2020.00018","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00018","url":null,"abstract":"Strabismus is a visual disorder characterized by eye misalignment. The effect of Panum's Fusional Area (PFA) compensates for small misalignments; however, prominent misalignments affect binocular vision, and when present in childhood they may lead to amblyopia, a developmental disorder of the visual system. With the advent of Virtual Reality (VR) technology, possibilities arise for novel binocular treatments for amblyopia in which the measurement of strabismus is crucial to correctly compensate for it. VR thus yields great potential due to its ability to display content to each eye independently. Most VR research addresses this topic using eye-tracking, while there is a paucity of research on image-based assessment methods. In this work, we propose a VR application for measuring strabismus in nine lines of sight. We conducted a study with 14 healthy participants to evaluate the system under two conditions: no strabismus and an artificial deviation induced by prism lenses. Further, we evaluated the effect of the PFA on the system by measuring its extent in horizontal and vertical lines of sight. Results show a significant difference between the expected deviation induced by prism lenses and the measured deviation. The remaining difference within the measurements can be explained by the recorded extent of the PFA.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133499017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"\"Kapow!\": Augmenting Contacts with Real and Virtual Objects Using Stylized Visual Effects","authors":"V. Mercado, Jean-Marie Normand, A. Lécuyer","doi":"10.1109/ISMAR-Adjunct51615.2020.00043","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00043","url":null,"abstract":"We propose a set of stylized visual effects (VFX) meant to improve the sensation of contact with objects in Augmented Reality (AR). Various graphical effects have been conceived, such as virtual cracks, virtual wrinkles, and virtual onomatopoeias inspired by comics. The VFX are meant to augment the perception of contact with either real or virtual objects, for instance in terms of material properties or contact location. These VFX can be combined with a pseudo-haptic approach to further increase the range of simulated physical properties of the touched materials. An illustrative setup based on a HoloLens headset was designed, in which our proposed VFX could be explored. The VFX appear each time a contact is detected between the user's finger and one object of the scene. Such a VFX-based approach could be introduced in AR applications for which the perception and display of contact information are important.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122777374","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Augmented Reality Narratives for Post-Traumatic Stress Disorder Treatment","authors":"Liu Chang, Á. Cassinelli, C. Sandor","doi":"10.1109/ISMAR-Adjunct51615.2020.00086","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00086","url":null,"abstract":"Globally, it is estimated that up to 1 billion children aged 2-17 years have experienced physical, sexual, or emotional violence in the past year [1], and 30% of abused children are likely to develop post-traumatic stress disorder (PTSD) [2]; 354 million adult war survivors suffer from PTSD [3]; and where natural disasters have occurred, 70.7% of survivors suffer from acute PTSD [4]. PTSD has not only high prevalence but also high lethality, being accompanied by multiple physical and mental comorbidities as well as strong suicidal tendencies [5]-[7]. This doctoral research aims to contribute to the development of PTSD treatment by investigating the possibility of adopting Augmented Reality (AR) narratives in treating PTSD. This four-year research project consists of three steps. In the first stage, we will conduct a comparative study between AR and VR narratives with healthy participants to verify whether AR narratives work better than VR narratives in eliciting participants' emotional engagement. In the second stage, we will create a system that integrates AR narratives with prolonged exposure (PE) treatment and test it with PTSD patients to verify its treatment efficacy. In the final stage, a semi-automatic, patient-authored AR system is expected to be achieved, through which patients can design their own exposure environment via voice input. This project will provide valuable experimental samples and scientific analysis for research in psychotherapy, narrative studies, and AR applications.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"124 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131641573","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Industrial Augmented Reality: Concepts and User Interface Designs for Augmented Reality Maintenance Worker Support Systems","authors":"Jisu Kim, Mario Lorenz, Sebastian Knopp, Philipp Klimant","doi":"10.1109/ISMAR-Adjunct51615.2020.00032","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00032","url":null,"abstract":"Maintenance departments of producing companies in most industrial countries are facing challenges originating from an aging workforce, increasing product variety, and the pressure to increase productivity. We present the concepts and user interface (UI) designs for two Augmented Reality (AR) applications that help to tackle these issues. An AR Guidance System will allow new and inexperienced staff to perform medium- to highly complex maintenance tasks that they are currently incapable of performing. The AR Remote Service System enables technicians at the machine to establish a voice/video stream with an internal or external expert. The video stream can be augmented with 3D models and drawings so that problems can be solved remotely and more efficiently. A qualitative assessment with maintenance managers and technicians from three producing companies rated the AR application concepts as beneficial and the UI designs as very usable.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"475 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123251512","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Decoupled Localization and Sensing with HMD-based AR for Interactive Scene Acquisition","authors":"S. Skovsen, Harald Haraldsson, A. Davis, H. Karstoft, Serge J. Belongie","doi":"10.1109/ISMAR-Adjunct51615.2020.00053","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00053","url":null,"abstract":"Real-time tracking and visual feedback make interactive AR-assisted capture systems a convenient and low-cost alternative to specialized sensor rigs and robotic gantries. We present a simple strategy for decoupling localization and visual feedback in these applications from the primary sensor being used to capture the scene. Our strategy is to use an AR HMD and 6-DOF controller for tracking and feedback, synchronized with a separate primary sensor for capturing the scene. This approach allows for convenient real-time localization of sensors that cannot do their own localization (e.g., microphones). In this poster paper, we present a prototype implementation of this strategy and investigate the accuracy of decoupled tracking by mounting a high-resolution camera as the primary sensor and comparing decoupled runtime pose estimates to those of an offline structure-from-motion reconstruction.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"406 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122799134","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Extended by Design: A Toolkit for Creation of XR Experiences","authors":"Arlindo Gómes, L. Figueiredo, W. Correia, V. Teichrieb, J. Quintino, F. Q. Silva, André L. M. Santos, Helder Pinho","doi":"10.1109/ISMAR-Adjunct51615.2020.00029","DOIUrl":"https://doi.org/10.1109/ISMAR-Adjunct51615.2020.00029","url":null,"abstract":"Over the last decade, the creation of extended reality (XR) solutions has increased significantly due to the advent of cheaper, more advanced, and more accessible instruments such as smartphones, headsets, platforms, development kits, and engines. For instance, the number of GitHub repositories for XR-related projects jumped from 51 in 2010 to over 15,000 in 2020. At the same time, the developer community approaches the creation of XR applications using design processes and methods inherited from past mainstream platforms such as web, mobile, or even product design. Unfortunately, those approaches do not consider the spatial aspects of these applications. In this paper, we present a revisited design process and a toolkit focused on the challenges innate to XR that aim to help beginner and experienced teams create applications and interactions in Virtual, Augmented, and Mixed Reality. We also present a compendium of 113 techniques and 118 guidelines, and a set of canvases that guide users through the process, preventing them from skipping important tasks and discoveries. Finally, we present a pilot case in which we accompany a team of developers and designers running our process and using our toolkit for the first time, showing the benefits of a process that addresses issues specific to XR apps.","PeriodicalId":433361,"journal":{"name":"2020 IEEE International Symposium on Mixed and Augmented Reality Adjunct (ISMAR-Adjunct)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129056903","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}