Title: Stretchy: Enhancing Object Sensation Through Multisensory Feedback and Muscle Input
Authors: Nicha Vanichvoranun, Bowon Kim, Dooyoung Kim, Jeongmi Lee, Sang Ho Yoon, Woontack Woo
DOI: 10.1109/VRW58643.2023.00195
Abstract: Current work on 3D interaction methods focuses mainly on rigid object manipulation and selection, while little has been done on elastic object interaction. We therefore propose a novel interaction method for observing and manipulating virtual fabric in a VR environment. We combine multisensory pseudo-haptic feedback (tactile plus visual feedback) with muscle strength data (EMG) to convey the stiffness of virtual fabric and other flexible objects. For demonstration, we created fabric patches of varying stiffness whose differences in stiffness can be distinguished. Our system could be deployed in a virtual clothing store to give consumers information about product stiffness and texture.

Title: AnnHoloTator: A Mixed Reality Collaborative Platform for Manufacturing Work Instruction Interaction
Authors: L. Stacchio, Vincenzo Armandi, L. Donatiello, G. Marfia
DOI: 10.1109/VRW58643.2023.00091
Abstract: eXtended Reality (XR) technologies may play a key role in the digital transformation sparked by the Industry 4.0 initiative. Within the virtuality continuum, Augmented and Mixed Reality (AR, MR) emerge as disruptive technologies: they can improve industrial processes while maximizing worker efficiency and mobility by providing virtual elements in real-world scenarios where and when necessary. In particular, such technologies can foster the adoption of Lean Manufacturing (LM) paradigms by supporting on-site assembly processes to improve productivity. We describe how such an approach was implemented in a real use case through the development of AnnHoloTator, a collaborative mixed reality platform for the Microsoft HoloLens 2 that allows users to visualize and manipulate digital documents directly in AR. Unlike existing systems, AnnHoloTator lets users explore and modify the content of cloud-stored digital documents by interacting with their holograms. The system takes a data-flexible approach: documents reside on a remote server, where most of the computation is also performed. This gives on-site professionals the advantage of being able to annotate a virtual document directly and share it with other workers.

Title: The Funneling Effect: A prototype implementation of an illusory sense of touch in virtual reality
Authors: Kalliopi Apostolou, Marios Charalambous, Stela Makri, Panayiotis Charalambous, F. Liarokapis
DOI: 10.1109/VRW58643.2023.00076
Abstract: This paper implements the "funneling illusion" in immersive virtual reality. The aim is to create a richer user experience by incorporating multiple senses and by improving the sense of presence and sense of body ownership. In future work, we will experimentally test the hypothesis that vibrotactile stimuli delivered by two separate handheld VR controllers can elicit an "out-of-the-body" illusory sense of touch in virtual reality.
{"title":"Ubiq-Genie: Leveraging External Frameworks for Enhanced Social VR Experiences","authors":"Nels Numan, D. Giunchi, Ben J. Congdon, A. Steed","doi":"10.1109/VRW58643.2023.00108","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00108","url":null,"abstract":"This paper describes the Ubiq-Genie framework for integrating external frameworks with the Ubiq social VR platform. The proposed architecture is modular, allowing for easy integration of services and providing mechanisms to offload computationally intensive processes to a server. To showcase the capabilities of the framework, we present two prototype applications: 1) a voice- and gesture-controlled texture generation method based on Stable Diffusion 2.0 and 2) an embodied conversational agent based on ChatGPT. This work aims to demonstrate the potential of integrating external frameworks into social VR for the creation of new types of collaborative experiences.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"181 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124534318","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"In the Future Metaverse, What Kind of UGC do Users Need?","authors":"Yanxiang Zhang, Wenbin Hu","doi":"10.1109/VRW58643.2023.00293","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00293","url":null,"abstract":"With the COVID-19 pandemic, people's real-life interactions diminished, and the game-based metaverse platforms such as Minecraft and Roblox are on the rise. The main users of these platforms are teenagers, they generate content in a virtual environment, which can significantly increase the activity of the platform. However, the experience of User-Generated Content in the metaverse is not very good. So what kind of support do users need to improve the efficiency of generating content in the metaverse? To investigate teenage users' preferences and expectations of it, this paper interviewed 72 teenagers aged 12–22 who are familiar with the metaverse game, and distilled 4 suggestions that can help promote metaverse users to generate content.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127772816","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"User study of omnidirectional treadmill control algorithms in VR","authors":"Mathias Delahaye, R. Boulic","doi":"10.1109/VRW58643.2023.00282","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00282","url":null,"abstract":"We conducted a comparative user study of the Infinadeck [1] omnidirectional treadmill native control algorithm alongside an approach based on a state observer control scheme combining a kinematic model with error dynamics that allows users to change their walk freely. Based on the results from 22 participants, we observed that the alternative approach outperformed the native algorithm on trajectories involving a left or right turn (with radii of curvature of 0.5m, 1m, and 2m). However, there was no significant difference for straight-line trajectories, and the native approach yielded the best scores for in-place rotations.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"88 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125593226","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Title: MRMSim: A Framework for Mixed Reality based Microsurgery Simulation
Authors: Nan Xiang, Hai-Ning Liang, Lingyun Yu, Xiaosong Yang, J. Zhang
DOI: 10.1109/VRW58643.2023.00207
Abstract: With the rapid development of computer technologies, virtual surgery has gained extensive attention over the past decades. In this research, we take advantage of mixed reality (MR), which creates an interactive environment where physical and digital objects coexist, and present MRMSim, a framework for MR-based microsurgery simulation. It enables users to practice microanastomosis skills with real microsurgical instruments rather than additional haptic feedback devices. Both hardware design and software development are included in this work. A prototype system demonstrates the feasibility and applicability of our framework.
{"title":"Projection Mapping in the Light: A Preliminary Attempt to Substitute Projectors for Room Lights","authors":"Masaki Takeuchi, D. Iwai, Kosuke Sato","doi":"10.1109/VRW58643.2023.00169","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00169","url":null,"abstract":"Projection mapping (PM) in a bright room creates the problem of reducing the contrast of the projected texture because ambient lighting elevates the black level of the projection target. In this paper, we developed a pseudo-ambient lighting technique that turns off the original ambient lighting and reproduces its illumination from the projectors on surfaces other than the target. We confirmed that the proposed technique could reproduce a bright room while suppressing the contrast reduction of the projected texture on the target, which helped to improve the viewing experience.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"40 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131371669","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}

Title: A Continuous Authentication Technique for XR Utilizing Time-Based One Time Passwords, Haptics, and Kinetic Activity
Authors: J. Grandi, Jerry L. Terrell, Kadir Lofca, Carlos Ruizvalencia, Regis Kopper
DOI: 10.1109/VRW58643.2023.00322
Abstract: Authentication in Extended Reality (XR) applications typically requires the user to enter a pattern or traditional password into an adapted two-dimensional UI, or to enter information from outside the XR environment, such as a pairing code on a mobile device. Existing solutions are far from ideal due to the inconvenience of repeatedly exiting and re-entering the XR environment to transfer codes, the risk associated with relying on static passwords, and the vulnerability caused by authenticating only at the start of the session. We present an authentication method developed for XR that offers robust security and an uninterrupted user experience. Our method uses a web-connected device able to generate time-based one-time passwords (TOTP) via haptics and to maintain continuous authentication by tracking the user's kinetic activity. We refer to this theoretical device as the authentication device and emulate it for this paper using either an XR tracker or a networked microcontroller with an attached IMU.
{"title":"An Exploratory Investigation into the Design of a Basketball Immersive Vision Training System","authors":"Pin-Xuan Liu, Tse-Yu Pan, Min-Chun Hu, Hung-Kuo Chu, Hsin-Shih Lin, Wen-Wei Hsieh, Chih–Jen Cheng","doi":"10.1109/VRW58643.2023.00205","DOIUrl":"https://doi.org/10.1109/VRW58643.2023.00205","url":null,"abstract":"Having good vision ability is important for basketball players who possesses the ball to efficiently search the wide-open teammates and quickly pass the ball to the one who has better chance to score according to the defenders' movements. To customize precise training scenarios for cultivating an individual athlete's vision abilities, we proposed a basketball immersive vision training system which considers not only the real-world vision training scenario in basketball team but also the concept of the optimal gaze behaviors to provide more reliable training tasks. Moreover, a four-week pilot study is also designed to evaluate whether the proposed system can train the vision ability of basketball players. The result of this pilot study shows the recruited five participants satisfied the experience of proposed training system.","PeriodicalId":412598,"journal":{"name":"2023 IEEE Conference on Virtual Reality and 3D User Interfaces Abstracts and Workshops (VRW)","volume":"148 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2023-03-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134411174","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}