Latest Publications — 2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR)

Coretet: A 21st Century Virtual Reality Musical Instrument for Solo and Networked Ensemble Performance
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797825
Rob Hamilton
Abstract: Coretet is a virtual reality instrument that explores the translation of performance gesture and mechanic from traditional bowed string instruments into an inherently non-physical implementation. Built using the Unreal Engine 4 and Pure Data, Coretet offers musicians a flexible and articulate musical instrument to play as well as a networked performance environment capable of supporting and presenting a traditional four-member string quartet. This paper discusses the technical implementation of Coretet and explores the musical and performative possibilities through the translation of physical instrument design into virtual reality.
Citations: 2
Scale - Unexplored Opportunities for Immersive Technologies in Place-based Learning
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797867
Jiayan Zhao, A. Klippel
Abstract: Immersive technologies have the potential to overcome physical limitations and virtually deliver field site experiences, for example, into the classroom. Yet, little is known about the features of immersive technologies that contribute to successful place-based learning. Immersive technologies afford embodied experiences by mimicking natural embodied interactions through a user's egocentric perspective. Additionally, they allow for beyond-reality experiences integrating contextual information that cannot be provided at actual field sites. The current study singles out one aspect of place-based learning: scale. In an empirical evaluation, scale was manipulated as part of two immersive virtual field trip (iVFT) experiences in order to disentangle its effect on place-based learning. Students either attended an actual field trip (AFT) or experienced one of two iVFTs using a head-mounted display. The iVFTs either mimicked the actual field trip or provided beyond-reality experiences offering access to the field site from an elevated perspective using pseudo-aerial 360° imagery. Results show that students with access to the elevated perspective had significantly better scores, for example, on their spatial situation model (SSM). Our findings provide initial results on how an increased (geographic) scale, accessible through an elevated perspective, boosts the development of SSMs. The reported study is part of a larger immersive education effort. Inspired by the positive results, we discuss our plan for a more rigorous assessment of scale effects on both self- and objectively assessed performance measures of spatial learning.
Citations: 23
Live Stereoscopic 3D Image with Constant Capture Direction of 360° Cameras for High-Quality Visual Telepresence
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797876
Y. Ikei, Vibol Yem, Kento Tashiro, Toi Fujie, Tomohiro Amemiya, M. Kitazaki
Abstract: To capture a remote 3D image, conventional stereo cameras attached to a robot head have commonly been used. However, when the head and cameras rotate, the captured image in buffers is degraded by latency and motion blur, which may cause VR sickness. In the present study, we propose a method named TwinCam, in which we use two 360° cameras spaced at the standard interpupillary distance and keep the direction of the lenses constant in world coordinates even when the camera bodies are rotated to reflect the orientation of the observer's head and the position of the eyes. We expect this method to reduce the image buffer size sent to the observer, because each camera captures the omnidirectional image without lens rotation. This paper introduces the mechanical design of our camera system and its potential for visual telepresence through three experiments. Experiment 1 confirmed the requirement of a stereoscopic rather than monoscopic camera for highly accurate depth perception, and Experiments 2 and 3 proved that our mechanical camera setup can reduce motion blur and VR sickness.
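The paper itself is not reproduced here, but the core geometric idea of the abstract — cameras that translate with the eyes while their bodies are counter-rotated so the lens direction stays fixed in the world frame — can be sketched in 2D. The 64 mm IPD and the coordinate conventions below are illustrative assumptions, not values from the paper.

```python
import math

IPD = 0.064  # assumed standard interpupillary distance, in meters


def camera_poses(head_yaw):
    """Return (left_pos, right_pos, body_rotation) for the two 360-degree
    cameras given the observer's head yaw in radians (about the vertical
    axis, 2D top-down view).

    The camera positions follow the eyes, offset half the IPD to each side
    of the head, while each camera body is counter-rotated by -head_yaw so
    its capture direction stays constant in world coordinates, avoiding
    rotational motion blur in the buffered image."""
    half = IPD / 2.0
    # Right-eye offset (half, 0) in the head frame, rotated into the world frame.
    rx = half * math.cos(head_yaw)
    ry = half * math.sin(head_yaw)
    right = (rx, ry)
    left = (-rx, -ry)
    body_rotation = -head_yaw  # cancels the head rotation for the camera body
    return left, right, body_rotation
```

With `head_yaw = 0` the right camera sits 32 mm to the observer's right; after a quarter turn of the head the positions swing around accordingly, but `body_rotation` always cancels the yaw, so the lens never rotates in the world frame.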
Citations: 4
Combining Dynamic Field of View Modification with Physical Obstacle Avoidance
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8798015
Fei Wu, Evan Suma Rosenberg
Abstract: Motion sickness is a major cause of discomfort for users of virtual reality (VR) systems. Over the past several years, several techniques have been proposed to mitigate motion sickness, such as high-quality "room-scale" tracking systems, dynamic field of view modification, and displaying static or dynamic rest frames. At the same time, an absence of real-world spatial cues may cause trouble during movement in virtual reality, and users may collide with physical obstacles. To address both of these problems, we propose a novel technique that combines dynamic field of view modification with rest frames generated from 3D scans of the physical environment. As the user moves, physically and/or virtually, the displayed field of view can be artificially reduced to reveal a wireframe visualization of the real-world geometry in the periphery, rendered in the same reference frame as the user. Although empirical studies have not yet been conducted, informal testing suggests that this approach is a promising method for simultaneously reducing motion sickness and improving user safety.
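Dynamic field of view modification of the kind the abstract describes is typically driven by the user's current speed. The following is a minimal sketch of that idea; the linear mapping and all threshold values (110°/70° FOV, 3 m/s) are illustrative assumptions, not parameters from the paper.

```python
def restricted_fov(speed, fov_max=110.0, fov_min=70.0, speed_max=3.0):
    """Linearly narrow the displayed field of view (in degrees) as the
    user's combined physical + virtual speed (m/s) increases.

    In a system like the one described, the periphery outside the
    restricted FOV would show the wireframe scan of the real room rather
    than plain black, serving as both a rest frame and an obstacle cue.
    All numeric thresholds here are illustrative."""
    t = min(max(speed / speed_max, 0.0), 1.0)  # clamp to [0, 1]
    return fov_max - t * (fov_max - fov_min)
```

At rest the full 110° is shown; at or above 3 m/s the view narrows to 70°, with a smooth linear ramp in between.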
Citations: 16
Perception of Motion-Adaptive Color Images Displayed by a High-Speed DMD Projector
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797850
Wakana Oshiro, S. Kagami, K. Hashimoto
Abstract: Recent progress in high-speed projectors using DMD (Digital Micromirror Device) technology has enabled low-latency motion adaptability of displayed images, which is a key challenge in achieving projection-based dynamic interaction systems. This paper presents an evaluation of different approaches to achieving fast motion adaptability with DMD projectors through a subjective image evaluation experiment and a discrimination experiment. The results suggest that the approach proposed by the authors, which updates the image position for every binary frame instead of for every video frame, applied to 60-fps video input offers perceptual image quality comparable with that offered by 500-fps projection.
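The abstract's key idea — repositioning the image at the binary-frame rate rather than the video-frame rate — can be illustrated by interpolating the image position between consecutive 60-fps video frames for each intermediate binary frame. This sketch is an assumption about one plausible realization, not the authors' implementation; the binary-frame count per video frame is likewise illustrative.

```python
def binary_frame_positions(p_prev, p_curr, binary_per_video=8):
    """Given the image positions at the previous and current 60-fps video
    frames, return an interpolated position for each of the intervening
    binary frames, so the projected image tracks motion at the binary
    frame rate instead of jumping once per video frame.

    p_prev, p_curr: (x, y) positions in projector pixels.
    binary_per_video: binary frames displayed per video frame (illustrative).
    """
    dx = (p_curr[0] - p_prev[0]) / binary_per_video
    dy = (p_curr[1] - p_prev[1]) / binary_per_video
    # One position per binary frame, ending exactly at the current video frame.
    return [(p_prev[0] + dx * (i + 1), p_prev[1] + dy * (i + 1))
            for i in range(binary_per_video)]
```

For an image moving 8 pixels between video frames, this yields eight 1-pixel steps rather than one 8-pixel jump, which is the perceptual smoothing the paper's comparison against 500-fps projection targets.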
Citations: 3
EPICSAVE Lifesaving Decisions - a Collaborative VR Training Game Sketch for Paramedics
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8798365
Jonas Schild, Leonard Flock, Patrick Martens, Benjamin Roth, Niklas Schünernann, Eduard Heller, Sebastian Misztal
Abstract: Practical, collaborative training for severe emergencies that occur too rarely within regular curricular training programs (e.g., anaphylactic shock in pediatric patients) is difficult to realize. Multi-user virtual reality and serious game technologies can be used to provide collaborative training in dynamic settings [1], [2]. However, actual training effects seem to depend on high presence and supportive usability [2]. EPICSAVE Lifesaving Decisions shows a novel approach that aims to further improve on these factors using an emotional scenario and collaborative game mechanics. We present a trailer video of a game sketch which creatively explores serious game design for collaborative virtual reality training systems. The game invites two paramedic trainees and one paramedic trainer into a dramatic scenario at a family theme park: a 5-year-old child shows symptoms of anaphylactic shock. While the trainees begin their diagnostic procedures, a bystander, the girl's grandfather, intervenes and challenges the players' authority. Our research explores how VR game mechanics, i.e., optional narrative, authority skills and rewards, mini games, and interactive virtual characters, may extend training quality and user experience beyond pure VR training simulations. The video exemplifies a concept that extends prior developments of a multi-user VR training simulation setup presented in [2], [3].
Citations: 7
EMBRACE - a VR piece about disability and inclusion (2018)
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8798138
F. Schroeder
Abstract: "Embrace" is a work created as part of a UK AHRC (Arts and Humanities Research Council) funded project on Immersive and Inclusive Music Technologies. The piece is for VR headset and was developed as one of the grant's proposed outputs. The research investigated how emerging technologies such as VR can best be adopted to suit people with different abilities (for example, movement-impaired people). "Embrace" allows the viewer to experience issues around disability. It tells the story of two disabled musicians (one visually impaired and one a wheelchair user) and how both experience exclusion before a concert. We also learn some background regarding the nature of their disabilities. The work aims to stimulate the viewer to embrace difference; hence the title "Embrace". It is a short immersive experience about inclusion and embracing difference, produced at the Sonic Arts Research Centre, Queen's University Belfast as part of the AHRC/EPSRC Next Generation of Immersive Experiences Programme 2018.
Citations: 0
Sphere in Hand: Exploring Tangible Interaction with Immersive Spherical Visualizations
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797887
David Englmeier, Isabel Schönewald, A. Butz, Tobias Höllerer
Abstract: The emerging possibilities of data analysis and exploration in virtual reality raise the question of how users can best be supported during such interactions. Spherical visualizations allow for convenient exploration of certain types of data. Our tangible sphere, exactly aligned with the sphere visualizations shown in VR, implements a very natural way of interaction and utilizes senses and skills trained in the real world. This work is motivated by the prospect of creating in VR a low-cost, tangible, robust, handheld spherical display that would be difficult or impossible to implement as a physical display. Our concept makes it possible to gain insights into the impact of a fully tangible embodiment of a virtual object on task performance, comprehension of patterns, and user behavior. After describing the implementation, we discuss the advantages and disadvantages of our approach, taking into account different handheld spherical displays utilizing outside and inside projection.
Citations: 8
Towards a Framework on Accessible and Social VR in Education
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8798100
Anthony Scavarelli, A. Arya, Robert J. Teather
Abstract: In this extended abstract, we argue that for virtual reality to be a successful tool in social learning spaces (e.g., classrooms or museums) we must also look outside the virtual reality literature to provide greater focus on accessible and socially collaborative content. We explore work within the Computer Supported Collaborative Learning (CSCL) and social VR domains to move towards developing a design framework for socio-educational VR. We also briefly describe our work-in-progress application framework, Circles, which includes these features in WebVR.
Citations: 13
Automatic Generation of Interactive 3D Characters and Scenes for Virtual Reality from a Single-Viewpoint 360-Degree Video
2019 IEEE Conference on Virtual Reality and 3D User Interfaces (VR) Pub Date: 2019-03-23 DOI: 10.1109/VR.2019.8797969
G. D. Dinechin, Alexis Paljic
Abstract: This work addresses the problem of using real-world data captured from a single viewpoint by a low-cost 360-degree camera to create an immersive and interactive virtual reality scene. We combine different existing state-of-the-art data enhancement methods based on pre-trained deep learning models to quickly and automatically obtain 3D scenes with animated character models from a 360-degree video. We provide details on our implementation and insight on how to adapt existing methods to 360-degree inputs. We also present the results of a user study assessing the extent to which virtual agents generated by this process are perceived as present and engaging.
Citations: 2