{"title":"VirtualPhobia: A Model for Virtual Therapy of Phobias","authors":"Sherazade Shunnaq, M. Raeder","doi":"10.1109/SVR.2016.20","DOIUrl":"https://doi.org/10.1109/SVR.2016.20","url":null,"abstract":"This study proposes a tool model for phobia treatments using exposure therapy and virtual reality. The model goal is to represent a system that enables the treatment of several phobias with different techniques, through the immersion of the patient in a virtual environment with head-mounted displays, supporting the addition of new modules of techniques or phobias. The present work seeks to propose a broader and more flexible virtual therapy system and obtaining a more current solution. For the evaluation of the applicability of this model, a prototype was developed with support to the treatments of flooding, implosion, rationalization and systematic desensitization, as well as support to more than one phobia. The prototype was tested with volunteers to evaluate the levels of reality, immersion and anxiety caused to analyze the possibility of the use of this tool to support phobia treatments. At the end of this study, the results showed that the developed model is applicable, and that virtual reality has great strength when applied to human's psychology, being that 80% of the participants reported feeling afraid or having other characteristics of anxiety.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128434395","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"SimImplanto - A Virtual Dental Implant Training Simulator","authors":"L. Pires, Yvens R. Serpa, M. A. Rodrigues","doi":"10.1109/SVR.2016.41","DOIUrl":"https://doi.org/10.1109/SVR.2016.41","url":null,"abstract":"Ongoing progress in the area of virtual reality and computer simulation has been providing applications that show tremendous promise in overcoming most of the deficiencies associated with training using cadaver parts and plastic artifacts. In this work, we present a 3D Virtual Drilling Simulator of the edentulous space of the dental arch in oral rehabilitation, controlled by the Novint Falcon haptic device. We developed the application from a dental cast (maxilla and mandible), which was scanned for the generation of the 3D model. Additionally, we have used a CT scan for the simulation of different bone densities of the implant region and respective drilling resistances. Preliminary tests were also performed by a specialist in the field, who has validated the perceived tactile and visual feedback and recognized the relevance of the simulator as a support tool for training students in Implant Dentistry.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114627517","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Evaluating Sign Language Recognition Using the Myo Armband","authors":"João Gabriel Abreu, J. M. Teixeira, L. Figueiredo, V. Teichrieb","doi":"10.1109/SVR.2016.21","DOIUrl":"https://doi.org/10.1109/SVR.2016.21","url":null,"abstract":"The successful recognition of sign language gestures by computer systems would greatly improve communications between the deaf and the hearers. This work evaluates the usage of electromyogram (EMG) data provided by the Myo armband as features for classification of 20 stationary letter gestures from the Brazilian Sign Language (LIBRAS) alphabet. The classification was performed by binary Support Vector Machines (SVMs), trained with a one-vs-all strategy. The results obtained show that it is possible to identify the gestures, but substantial limitations were found that would need to be tackled by further studies.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"137 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121950463","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Virtual Body Swap: A New Feasible Tool to Be Explored in Health and Education","authors":"Elen Collaço de Oliveira, P. Bertrand, M. R. Lesur, Priscila Palomo, M. Demarzo, A. Cebolla, R. Baños, R. Tori","doi":"10.1109/SVR.2016.23","DOIUrl":"https://doi.org/10.1109/SVR.2016.23","url":null,"abstract":"Virtual reality has been widely explored to immerse users in environments other than those considered to be their surrounding realities. We discuss the possibility of immersion not in another environment but in another person's body. The power of body swap illusion opens up a great deal of possibilities and applications in several areas, such as neuroscience, psychology, and education. For this experiment, we used a low budget system that reproduces a person's head movements as if one's own head were in another body viewed through a head mounted display (HMD) while having body agency, i.e., controlling the movements of another real body as if it was a \"real avatar\". In this pilot study we describe the tool in details and discuss its feasibility and preliminary results based on the analysis of the participants' perceptions collected through validated questionnaires and in-depth interviews. We observed that the system does promote higher levels of realism and involvement (\"presence\") compared with an immersion experience without body agency. Moreover, spontaneous declarations by the participants also showed how impactful this experience may be. 
Future applications of the tool are discussed.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134070650","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Enhancing Collaboration on a Cloud-Based CVE for Supporting Surgical Education","authors":"P. V. F. Paiva, L. Machado, A. Valença, R. Moraes, Thiago V. V. Batista","doi":"10.1109/SVR.2016.16","DOIUrl":"https://doi.org/10.1109/SVR.2016.16","url":null,"abstract":"In recent decades, a greater number of Collaborative Virtual Environments (CVEs) have been developed and sought for collaborative training of medical personnel in VR. Observing the needs identified in the literature, a multidisciplinary team developed a collaborative simulator for education and assessment of student groups in basic surgical routines, called SimCEC. The system was developed according to a strict methodology of design. Considering important needs with regard to storage guarantee, consistency and availability of the CVE, SimCEC was recently added by a cloud data distribution architecture for managing multiple virtual rooms for training of student teams, enabling collaboration among different areas of health. This paper discusses the theoretical and practical aspects of such tool, as well as its advantages and the new possibilities offered for surgical education area.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134383820","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Virtual Environment for Drone Pilot Training Using VR Devices","authors":"Guilherme Riter Postal, W. Pavan, Rafael Rieder","doi":"10.1109/SVR.2016.39","DOIUrl":"https://doi.org/10.1109/SVR.2016.39","url":null,"abstract":"This paper presents the development of a Virtual Reality environment for drone pilot training using interaction devices. We built an immersive interface in order to enhance the user experience in UAV training tasks comparing to the traditional control interfaces. To reach this aim, we chosen the Microsoft Kinect to control the drone and the Oculus Rift to visualize the scene, as well as keyboard, mouse and joystick support. As a result, we created a virtual environment in which users can train the drone pilot safely. Despite a good impression on preliminary tests, the solution still needs an evaluation by users and improvements in Physics simulations.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114963086","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Development of an Open Source Software for Real Time Optical Motion Capture","authors":"David Lunardi Flam, J. Gomide, A. Araújo","doi":"10.1109/SVR.2016.28","DOIUrl":"https://doi.org/10.1109/SVR.2016.28","url":null,"abstract":"This paper discusses the development and release of an open source real time motion capture system for character animation, the OpenMoCap. The software carries out the entire pipeline for acquisition of motion data and its output in appropriate formats to modeling and animation software. OpenMoCap uses the passive optical capture technique to follow markers positions in a scene across the time, and its development is based on digital image analysis techniques. The defined architecture is designed for real time motion recording and is flexible and modular. It allows the addition of new optimized modules for specific functions, as an interface to virtual and augmented reality headsets, different cameras sets and markerless motion capture, taking advantage of the existing ones. In order to obtain quantitative results to assess the software and the created motion capture workflow, OpenMoCap was compared with a commercial optical motion capture system. The performance of the three most used trackers was also evaluated with OpenMoCap The present version of the software is available to download at GitHub. 
In this paper, the architecture, construction modules and performance comparison are discussed with the scope to present the software as a choice to input motion data to virtual and augmented reality applications.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125960442","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Technologies Integration of Immersive Virtual Reality on Smartphones with Real-Time Motion Capture","authors":"Marlon Dantas Braga, G. Mota, R. M. D. Costa","doi":"10.1109/SVR.2016.30","DOIUrl":"https://doi.org/10.1109/SVR.2016.30","url":null,"abstract":"This paper aims at presenting the integration of technologies used to improve the experience of immersion in virtual reality applications with affordable devices. The application explored the Kinect motion capture and the Smartphone sensors. Also, it used an intermediary computer and a virtual reality cardboard with a Smartphone. The application provided the interaction and the avatar control through natural body movements. The technologies integration has sufficient accuracy to recognize movements from user's thorax, arms and legs.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124808904","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Study for Postural Evaluation and Movement Analysis of Individuals","authors":"Claiton L. V. Lisboa, L. Nedel, Anderson Maciel","doi":"10.1109/SVR.2016.29","DOIUrl":"https://doi.org/10.1109/SVR.2016.29","url":null,"abstract":"Researchers have investigated the tracking and recognition of human postures for a long time. Firstly, with the objective of better understanding our movements and behaviors, and then to detect wrong movements and try to help people to perform better. The equipments used to track these movements were invasive, complex and expensive, but with the releasing of off-the-shelf devices, such as the Microsoft Kinect, the study of movement became accessible to everyone. This work presents a study for postural evaluation and movement analysis of individuals to help them to assess and correct their movements during training. One application based on Kinect was developed for physical exercises, more specifically, for CrossFit. We applied it for posture tracking and tested it with individuals to evaluate their posture and movements during a CrossFit session. The results indicate that the application is able to provide the athlete with similar feedback to the coach, showing that it is viable to be used in the absence of an expert. Results also show that the feedback has influence on the correction of the postures of the individuals, making them capable of identifying and correcting wrong gestures during the training. 
This study points out some very interesting research possibilities for devices that monitor human posture, particularly with regard to the spinal region.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116534917","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3D Printing as a Means for Augmenting Existing Surfaces","authors":"J. M. Teixeira, Gutenberg Barros, V. Teichrieb, W. Correia","doi":"10.1109/SVR.2016.15","DOIUrl":"https://doi.org/10.1109/SVR.2016.15","url":null,"abstract":"Three dimensional printing has gained considerable interest lately due to the proliferation of inexpensive devices as well as open source software that drive those devices. Countless possibilities exist, but the printing pipeline remains the same, going from object modeling through CAD software to the slicing of the 3D model and its further printing. This work proposes a modification in the way 3D printing content is both modeled and printed. The input comes from 2D drawings, while the printing occurs over the same drawing used before. This way, the original paper containing the drawing is \"augmented\" by the 3D printing material, since there is a registration between both real and 3D printing coordinates. By attaching a webcam to a 3D printer, we were able to make the entire system understand the drawing by seeing it and follow its contours during the printing process. We suggest distinct application scenarios that can benefit from this promising technology, ranging from architecture professionals to visually impaired people.","PeriodicalId":444488,"journal":{"name":"2016 XVIII Symposium on Virtual and Augmented Reality (SVR)","volume":"105 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-06-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133494355","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}