Latest Publications: 2020 IEEE International Conference on Artificial Intelligence and Virtual Reality (AIVR)

Immersive Captioning: Developing a framework for evaluating user needs
Chris J. Hughes, Marta B. Zapata, Matthew Johnston, P. Orero
DOI: https://doi.org/10.1109/AIVR50618.2020.00063
Abstract: This article focuses on captioning for immersive environments, and the research aims to identify how to display captions for an optimal viewing experience. This work began four years ago with partial findings. The second stage of research, built on the lessons learnt, focuses on the cornerstone of design requirements: prototyping. A tool has been developed for quick and realistic prototyping and testing. The framework integrates methods used in existing solutions. Given how easy it is to contrast and compare, the need to extend the first framework was obvious. A second, improved solution was developed, almost as a showcase of how ideas can quickly be implemented for user testing. After an overview of captions in immersive environments, the article describes the implementation, which is based on web technologies and is therefore open to any device with a web browser, including desktop computers, mobile devices and head-mounted displays. The article finishes with a description of the new caption modes and methods, intended as a useful tool towards testing and standardisation.
Citations: 5

Investigating learners' motivation towards a virtual reality learning environment: a pilot study in vehicle painting
Miriam Mulders
DOI: https://doi.org/10.1109/AIVR50618.2020.00081
Abstract: The HandleVR project develops a Virtual Reality (VR) training based on the 4C/ID model [1] to train vocational competencies in the field of vehicle painting. The paper presents the results of a pilot study with fourteen aspiring vehicle painters who tested two prototypical tasks in VR and evaluated their suitability, among other aspects with regard to learning motivation. The results indicate that VR training is highly motivating and that some elements (e.g., a virtual trainer) promote motivation in particular. Further research is needed to take advantage of these positive motivational effects to support meaningful learning.
Citations: 4

SnapMove: Movement Projection Mapping in Virtual Reality
B. Cohn, A. Maselli, E. Ofek, Mar González-Franco
DOI: https://doi.org/10.1109/AIVR50618.2020.00024
Abstract: We present SnapMove, a technique to reproject reaching movements in Virtual Reality. SnapMove can be used to reduce the need for large, fatiguing or difficult motions. We designed multiple reprojection techniques (linear or planar; uni-manual, bi-manual or head snap) that can be used for reaching, throwing and virtual tool manipulation. In a user study (n=21) we explored whether the self-avatar follower effect can be modulated depending on the cost of the motion introduced by remapping. SnapMove was successful in reprojecting the user's hand position from, for example, a lower area to a higher avatar-hand position, a mapping which can be ideal for limiting fatigue. It was also successful in preserving avatar embodiment and in gradually bringing users to perform movements with higher energy costs, which are of most interest for rehabilitation scenarios. We implemented applications for menu interaction, climbing, rowing, and throwing darts. Overall, SnapMove can make interactions in virtual environments easier. We discuss the potential impact of SnapMove for applications in gaming, accessibility and therapy.
Citations: 8

Rainbow Learner: Lighting Environment Estimation from a Structural-color based AR Marker
Yuji Tsukagoshi, Yuuki Uranishi, J. Orlosky, Kiyomi Ito, H. Takemura
DOI: https://doi.org/10.1109/AIVR50618.2020.00074
Abstract: This paper proposes a method for estimating lighting environments from an AR marker coupled with the structural-color patterns inherent to a compact disc (CD) form factor. To achieve photometric consistency, these patterns are used as input to a Conditional Generative Adversarial Network (CGAN), which allows us to efficiently and quickly generate estimations of an environment map. We construct a dataset from pairs of images of the structural-color pattern and the environment map captured in multiple scenes, and the CGAN is then trained on this dataset. Experiments show that the method can generate visually accurate reconstructions for certain scenes, and that the environment map can be estimated in real time.
Citations: 0

Immersive Visualization of Dengue Vector Breeding Sites Extracted from Street View Images
Mores Prachyabrued, P. Haddawy, Krittayoch Tengputtipong, Myat Su Yin, D. Bicout, Yongjua Laosiritaworn
DOI: https://doi.org/10.1109/AIVR50618.2020.00016
Abstract: Dengue is considered one of the most serious global health burdens. The primary vector of dengue is the Aedes aegypti mosquito, which has adapted to human habitats and breeds primarily in artificial containers that can hold water. Control of dengue relies on effective mosquito vector control, for which detection and mapping of potential breeding sites is essential. The two traditional approaches to this have been to use satellite images, which do not provide sufficient resolution to detect a large proportion of the breeding sites, and manual counting, which is too labor-intensive to be used on a routine basis over large areas. Our recent work has addressed this problem by applying convolutional neural nets to detect outdoor containers representing potential breeding sites in Google Street View images. The challenge is now not a paucity of data, but rather transforming the large volumes of data produced into meaningful information. In this paper, we present the design of an immersive visualization using a tiled-display wall that supports an early but crucial stage of dengue investigation, by enabling researchers to interactively explore and discover patterns in the datasets, which can help in forming hypotheses that can drive quantitative analyses. The tool is also useful in uncovering patterns that may be too sparse to be discovered by correlational analyses and in identifying outliers that may justify further study. We demonstrate the usefulness of our approach with two usage scenarios that lead to insights into the relationship between dengue incidence and container counts.
Citations: 3

A Smartphone Thermal Temperature Analysis for Virtual and Augmented Reality
Xiaoyang Zhang, Harshit Vadodaria, Na Li, K. Kang, Yao Liu
DOI: https://doi.org/10.1109/AIVR50618.2020.00061
Abstract: Emerging virtual and augmented reality applications are envisioned to significantly enhance user experiences. An important issue related to user experience is thermal management in the smartphones widely adopted for virtual and augmented reality applications. Although smartphone overheating has been reported many times, systematic measurement and analysis of thermal behavior is relatively scarce, especially for virtual and augmented reality applications. To address this, we build a temperature measurement and analysis framework for virtual and augmented reality applications using a robot, infrared cameras, and smartphones. Using the framework, we analyze a comprehensive set of data including the battery power consumption, smartphone surface temperature, and temperature of key hardware components, such as the battery, CPU, GPU, and WiFi module. When a 360° virtual reality video is streamed to a smartphone, the phone surface temperature reaches nearly 39 °C. Also, the temperature of the phone surface and its main hardware components generally increases until the end of our 20-minute experiments, despite thermal control undertaken by smartphones, such as CPU/GPU frequency scaling. Our thermal analysis results for a popular AR game are even more serious: the battery power consumption frequently exceeds the thermal design power by 20-80%, while the peak battery, CPU, GPU, and WiFi module temperatures exceed 45, 70, 70, and 65 °C, respectively.
Citations: 2

Under The (Plastic) Sea - Sensitizing People Toward Ecological Behavior Using Virtual Reality Controlled by Users' Physical Activity
Carolin Straßmann, Alexander Arntz, S. Eimler
DOI: https://doi.org/10.1109/AIVR50618.2020.00036
Abstract: As environmental pollution continues to expand, new ways of raising awareness of its consequences need to be explored. Virtual reality has emerged as an effective tool for behavioral change. This paper investigates whether virtual reality applications controlled through physical activity can produce an even stronger effect, because physical activity enhances attention and recall performance by stimulating the working memory through motor functions. This was tested in an experimental study using a virtual reality head-mounted display in combination with the ICAROS fitness device, enabling participants to explore either a plastic-polluted or a non-polluted sea. Results indicated that using a regular controller elicits more presence and a more intense flow experience than the ICAROS condition, which participants controlled via their physical activity. Moreover, the plastic-polluted stimulus was more effective in inducing attitude change than the non-polluted sea.
Citations: 2

Exploring the possibilities of Extended Reality in the world of firefighting
Janne Heirman, S. Selleri, Tom De Vleeschauwer, Charles Hamesse, Michel Bellemans, Evarest Schoofs, R. Haelterman
DOI: https://doi.org/10.1109/AIVR50618.2020.00055
Abstract: Firefighting is a crucial part of the Navy's training program, as it must ensure safety on board. This training is dangerous, expensive and environmentally unfriendly. The Navy is therefore looking for a safer form of training that can enhance the current one. Extended Reality technology offers new ways of training, with the promise of alleviating issues related to training danger, costs and environmental pollution. In this work, we develop and evaluate a Virtual Reality simulator and a proof of concept of a Mixed Reality simulator, together with a firehose controller adapted to the needs of the Navy's firefighting training program.
Citations: 8

The Efficacy of a Virtual Reality-Based Mindfulness Intervention
Caglar Yildirim, Tara O'Grady
DOI: https://doi.org/10.1109/AIVR50618.2020.00035
Abstract: Mindfulness can be defined as increased awareness of, and sustained attentiveness to, the present moment. Recently, there has been growing interest in the applications of mindfulness in empirical wellbeing research and in the use of virtual reality (VR) environments and 3D interfaces as a conduit for mindfulness training. Accordingly, the current experiment investigated whether a brief VR-based mindfulness intervention could induce a greater level of state mindfulness compared to an audio-based intervention and a control group. Results indicated that the two mindfulness interventions, VR-based and audio-based, induced a greater state of mindfulness compared to the control group. Participants in the VR-based mindfulness intervention group reported a greater state of mindfulness than those in the guided-audio group, indicating that the immersive mindfulness intervention was more robust. Collectively, these results provide empirical support for the efficacy of a brief VR-based mindfulness intervention in inducing a robust state of mindfulness in laboratory settings.
Citations: 10

CrowdAR Table: An AR System for Real-time Interactive Crowd Simulation
Noud Savenije, Roland Geraerts, Wolfgang Hürst
DOI: https://doi.org/10.1109/AIVR50618.2020.00021
Abstract: Spatial augmented reality, where virtual information is projected into a user's real environment, provides tremendous opportunities for immersive analytics. In this demonstration, we focus on real-time interactive crowd simulation, that is, the illustration of how crowds move under certain circumstances. Our augmented reality system, called CrowdAR, allows users to study a crowd's motion behavior by projecting the output of our simulation software onto an augmented reality table and objects on this table. Our prototype system is currently being revised and extended to serve as a museum exhibit. Using real-time interaction, it can teach scientific principles about simulations and illustrate how these, in combination with augmented reality, can be used for crowd behavior analysis.
Citations: 4