{"title":"SpArc: A VR Animating Tool at Your Fingertips","authors":"Bingyu Li, Ines Said, Linda Kirova, Maria Blokhina, H. Kang","doi":"10.1145/3489849.3489920","DOIUrl":"https://doi.org/10.1145/3489849.3489920","url":null,"abstract":"3D animation is becoming a popular form of storytelling in many fields, bringing life to games, films, and advertising. However, the complexity of conventional 3D animation software poses a steep learning curve for novices. Our work aims to lower such barriers by creating a simple yet immersive interface that users can easily interact with. Based on focus-group interviews, we identified the key functionalities in animation workflows. The resulting tool, SpArc, is designed for two-handed setups and allows users to dive into animating without a complex rigging and skinning process or learning multiple menu interactions. Instead of a conventional horizontal slider, we designed a radial time slider to reduce possible arm fatigue and enhance the accuracy of keyframe selection. The demo will showcase this interactive 3D animation tool.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"36 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127115075","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
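The abstract above describes replacing a horizontal timeline slider with a radial one for keyframe selection. As a minimal sketch of that idea (not the paper's implementation), the pointer angle on the dial can be mapped evenly onto the keyframe range; the function name and the assumption of a uniform angular mapping are illustrative only.

```python
import math

def angle_to_keyframe(x, y, num_keyframes):
    """Map a 2D pointer position on a radial dial to a keyframe index.

    Illustrative only: SpArc's actual mapping is not published. (x, y) is
    the pointer position relative to the dial's center; the full circle is
    divided evenly among `num_keyframes` slots.
    """
    angle = math.atan2(y, x) % (2 * math.pi)   # 0 .. 2*pi, counter-clockwise
    fraction = angle / (2 * math.pi)           # normalized position on the dial
    return min(int(fraction * num_keyframes), num_keyframes - 1)

# Example: pointer straight above the dial center on a 24-keyframe timeline
print(angle_to_keyframe(0.0, 1.0, 24))  # -> 6 (a quarter of the way around)
```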
{"title":"An Interactive Flight Operation with 2-DOF Motion Platform","authors":"Riku Fukuyama, Wataru Wakita","doi":"10.1145/3489849.3489911","DOIUrl":"https://doi.org/10.1145/3489849.3489911","url":null,"abstract":"We propose an interactive flight operation with a 2-DOF motion platform that tilts the user greatly according to the user's posture and the VR environment. To realize flight like a hang glider, this work interactively controls the motion platform according to the attitude of the user. By tilting the body forward/backward and left/right while keeping it horizontal in a planche-like posture, the user tilts the virtual aircraft in that direction, and the motion platform performs the corresponding rolling movements. In addition, since our motion platform uses a balance board that swings with the rolling motion, it can realize large swings safely and at low cost.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123270790","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
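The abstract describes mapping the user's body tilt to the virtual aircraft's attitude and to the platform's rolling motion. A minimal sketch of such a mapping is shown below, assuming tilt angles from a posture sensor, a proportional gain, and a clamp at the platform's mechanical limits; the function and parameter names are hypothetical, not the authors' controller.

```python
def posture_to_commands(body_roll_deg, body_pitch_deg,
                        gain=1.0, max_platform_deg=15.0):
    """Map the user's body tilt to aircraft attitude and platform commands.

    Illustrative sketch only: the aircraft mirrors the body tilt directly,
    while the 2-DOF platform follows the same tilt scaled by `gain` and
    clamped to an assumed mechanical limit.
    """
    clamp = lambda v: max(-max_platform_deg, min(max_platform_deg, v))
    aircraft = {"roll": body_roll_deg, "pitch": body_pitch_deg}
    platform = {"roll": clamp(gain * body_roll_deg),
                "pitch": clamp(gain * body_pitch_deg)}
    return aircraft, platform

# Leaning 20 degrees to the right and 5 degrees forward
print(posture_to_commands(20.0, 5.0))
```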
{"title":"A System for Practicing Ball/Strike Judgment in VR Environment","authors":"Kentarou Yanase, Shunji Muto, Kyohei Masuko, Tomoyuki Nagami, Takashi Ijiri","doi":"10.1145/3489849.3489931","DOIUrl":"https://doi.org/10.1145/3489849.3489931","url":null,"abstract":"The purpose of this study is to develop an easy-to-use ball/strike judgment practice system for inexperienced baseball umpires. The main idea is to provide a practice environment in a Virtual Reality (VR) space. With our system, users observe a pitched ball, perform a ball/strike judgment, and review their judgment in the VR space. Since the whole process is completed in VR, users can practice judgments without preparing a pitcher and catcher. We conducted a user investigation in which participants practiced with our system and then judged balls thrown by a pitching machine. The participants responded positively when asked about the usefulness of our system.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117315186","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
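The abstract mentions that users perform a ball/strike judgment and then review it in VR. One way such a review could work, sketched below purely as an assumption (the paper does not publish its judgment logic, and the zone dimensions here are placeholders), is to compare the trainee's call against a geometric strike-zone test at the moment the ball crosses the plate.

```python
def is_strike(ball_x_m, ball_z_m, zone_half_width_m=0.25,
              zone_bottom_m=0.5, zone_top_m=1.1):
    """Return True if the ball crosses the plate inside the strike zone.

    Illustrative only: zone dimensions are placeholders, not the paper's;
    x is the horizontal offset from the plate center, z is the height.
    """
    return (abs(ball_x_m) <= zone_half_width_m
            and zone_bottom_m <= ball_z_m <= zone_top_m)

def review_judgment(user_called_strike, ball_x_m, ball_z_m):
    """Compare the trainee's call against the geometric ground truth."""
    truth = is_strike(ball_x_m, ball_z_m)
    return {"ground_truth": "strike" if truth else "ball",
            "correct": user_called_strike == truth}

print(review_judgment(True, 0.1, 0.9))   # correct strike call
print(review_judgment(True, 0.4, 0.9))   # a ball incorrectly called a strike
```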
{"title":"Enabling Robot-assisted Motion Capture with Human Scale Tracking Optimization","authors":"Pascal Chiu, Jiawei Huang, Y. Kitamura","doi":"10.1145/3489849.3489881","DOIUrl":"https://doi.org/10.1145/3489849.3489881","url":null,"abstract":"Motion tracking systems with viewpoint concerns, or whose marker data include unreliable states, have proven difficult to use despite their many benefits. We propose a technique, inspired by active vision and using a customized hill-climbing approach, to control a robot-sensor setup, and we apply it to a magnetic induction system capable of occlusion-free motion tracking. Our solution reduces the impact of displacement and orientation issues for markers, which inherently present a dead-angle range that disturbs usability and accuracy. The resulting interface stabilizes previously unexploitable data, prevents sub-optimal states that occur up to hundreds of times per recording, and achieves an approximately 40% decrease in tracking error.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"23 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114985078","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
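The abstract names a customized hill-climbing approach for controlling the robot-sensor setup. Below is a generic hill-climbing sketch over a candidate sensor pose under an assumed tracking-error cost; it is not the authors' customized variant, and the cost function and names are illustrative.

```python
import random

def hill_climb_sensor_pose(cost, initial_pose, step=0.05, iters=200, seed=0):
    """Generic hill climbing over a sensor pose (here, a 3D position).

    Illustrative only: in the paper's setting, `cost` would be an estimate
    of tracking error for the robot-mounted magnetic-induction sensor given
    the current marker configuration.
    """
    rng = random.Random(seed)
    pose, best = list(initial_pose), cost(initial_pose)
    for _ in range(iters):
        candidate = [p + rng.uniform(-step, step) for p in pose]
        c = cost(candidate)
        if c < best:           # keep only improving moves
            pose, best = candidate, c
    return pose, best

# Toy cost: squared distance from an ideal viewpoint unknown to the climber
ideal = (0.3, -0.2, 1.0)
cost = lambda p: sum((a - b) ** 2 for a, b in zip(p, ideal))
print(hill_climb_sensor_pose(cost, (0.0, 0.0, 0.5)))
```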
{"title":"Modeling Pointing for 3D Target Selection in VR","authors":"Tor-Salve Dalsgaard, Jarrod Knibbe, Joanna Bergström","doi":"10.1145/3489849.3489853","DOIUrl":"https://doi.org/10.1145/3489849.3489853","url":null,"abstract":"Virtual reality (VR) allows users to interact similarly to how they do in the physical world, such as touching, moving, and pointing at objects. To select objects at a distance, most VR techniques rely on casting a ray through one or two points located on the user’s body (e.g., on the head and a finger) and placing a cursor on that ray. However, previous studies show that such rays neither help users achieve optimal pointing accuracy nor correspond to how they would naturally point. We seek to find the features that best describe natural pointing at distant targets. We collect motion data from seven locations on the hand, arm, and body while participants point at 27 targets across a virtual room. We evaluate the features of pointing and analyse feature sets for predicting pointing targets. Our analysis shows an 87% classification accuracy between the 27 targets for the best feature set and a mean distance of 23.56 cm in predicting pointing targets across the room. The feature sets can inform the design of more natural and effective VR pointing techniques for distant object selection.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125357758","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
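The abstract describes the conventional baseline: casting a ray through two body points (e.g., head and fingertip) and placing a cursor on that ray. The sketch below illustrates that baseline with targets modeled as points and a nearest-to-ray selection rule; this is the technique the paper critiques, not its learned feature-set predictor, and the names are illustrative.

```python
import numpy as np

def two_point_ray(origin, through):
    """Ray from `origin` (e.g., the head) through a second body point (e.g., a fingertip)."""
    d = np.asarray(through, float) - np.asarray(origin, float)
    return np.asarray(origin, float), d / np.linalg.norm(d)

def closest_target(origin, direction, targets):
    """Select the target whose center lies closest to the ray.

    Illustrative selection rule only; the paper instead learns which motion
    features best predict the intended target.
    """
    best, best_dist = None, float("inf")
    for name, center in targets.items():
        v = np.asarray(center, float) - origin
        t = max(0.0, float(v @ direction))             # project onto the ray
        dist = float(np.linalg.norm(v - t * direction))
        if dist < best_dist:
            best, best_dist = name, dist
    return best, best_dist

o, d = two_point_ray([0, 1.7, 0], [0.3, 1.5, 0.5])      # head -> fingertip
print(closest_target(o, d, {"lamp": (1.5, 0.8, 2.5), "door": (-1.0, 1.0, 3.0)}))
```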
{"title":"Preliminary analysis of visual cognition estimation in VR toward effective assistance timing for iterative visual search tasks","authors":"Syunsuke Yoshida, Makoto Sei, A. Utsumi, H. Yamazoe","doi":"10.1145/3489849.3489954","DOIUrl":"https://doi.org/10.1145/3489849.3489954","url":null,"abstract":"This research aims to develop a method for assisting iterative visual search tasks, focusing on visual cognition to achieve effective assistance. As a first step toward this goal, we analyzed the participants’ gaze behaviors when they visually recognized a target in a VR environment. The experiment considers the effect of visual cognition difficulty (VCD). The analysis results show that the participants could visually recognize lower-VCD targets earlier. This suggests that VCD-based guidance may improve task performance.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"37 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128777775","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Compact and Low-cost VR Tooth Drill Training System using Mobile HMD and Stylus Smartphone","authors":"Tatsuki Takano, Kazuki Takashima, Kazuyuki Fujita, Hong Guang, Kaori Ikematsu, Y. Kitamura","doi":"10.1145/3489849.3489933","DOIUrl":"https://doi.org/10.1145/3489849.3489933","url":null,"abstract":"Drilling teeth is an essential technique for dental learners. However, existing VR tooth-drill training simulators are physically large and costly, and thus cannot be used at home or in the classroom for private study. This work presents a novel low-cost mobile VR dental simulator using off-the-shelf devices: a mobile HMD and a stylus smartphone. In this system, a 3D-printed physical teeth prop is placed on an EMR stylus smartphone, whose stylus tracks the tip position of a physical drill. Unlike existing solutions using haptic/force devices, our approach, which involves physical contact between the prop and the drill tip, gives the user a natural sensation of tooth hardness. The use of the smartphone stylus should enable significantly more accurate drill-position sensing around the teeth than the HMD’s accompanying controllers. We also developed VR software to simulate tooth drilling on this setup. This demo will show how our new mobile simulator offers a realistic feeling of drilling teeth.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121274193","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
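The abstract says the smartphone's EMR stylus tracks the drill tip against the 3D-printed prop. As a rough sketch of the kind of coordinate conversion that implies, the code below maps 2D stylus screen coordinates plus hover height onto a position in the VR scene; the scaling, the flat-phone assumption, and all names are placeholders, since the paper's calibration is not published.

```python
def stylus_to_drill_tip(stylus_x_px, stylus_y_px, hover_mm,
                        px_per_mm=20.0, phone_origin_m=(0.0, 0.75, 0.4)):
    """Convert EMR stylus readings into a drill-tip position in the VR scene.

    Illustrative only: the real calibration between the stylus digitizer,
    the teeth prop, and the VR scene is not described in the abstract.
    The phone surface is assumed to lie flat at `phone_origin_m`.
    """
    x_m = phone_origin_m[0] + stylus_x_px / px_per_mm / 1000.0
    z_m = phone_origin_m[2] + stylus_y_px / px_per_mm / 1000.0
    y_m = phone_origin_m[1] + hover_mm / 1000.0   # hover height above the screen
    return (x_m, y_m, z_m)

print(stylus_to_drill_tip(540.0, 960.0, 3.0))
```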
{"title":"An Infant-Like Device that Reproduces Hugging Sensation with Multi-Channel Haptic Feedback","authors":"Taiyo Natomi, Yasuji Kitabatake, Kazuyuki Fujita, T. Onoye, Yuichi Itoh","doi":"10.1145/3489849.3489927","DOIUrl":"https://doi.org/10.1145/3489849.3489927","url":null,"abstract":"Proximity interaction, such as hugging, plays an essential role in building relationships between parents and children. However, parents and children cannot interact freely in the neonatal intensive care unit due to visiting restrictions imposed by COVID-19. In this study, we develop a system for pseudo-proximity interaction with a remote infant through a VR headset by using an infant-like device that reproduces the haptic feedback features of the hugging sensation, such as weight, body temperature, breathing, softness, and an unstable neck.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128236769","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Investigating the Effect of Sensor Data Visualization Variances in Virtual Reality","authors":"T. Vu, Dinh Tung Le, Dac Dang Khoa Nguyen, Sheila Sutjipto, G. Paul","doi":"10.1145/3489849.3489877","DOIUrl":"https://doi.org/10.1145/3489849.3489877","url":null,"abstract":"This paper investigates the effect of real-time sensor data variances on humans performing straightforward assembly tasks in a Virtual Reality-based (VR-based) training system. A VR-based training system has been developed to transfer color and depth images and construct colored point-cloud data to represent objects in real time. Various parameters that affect sensor data acquisition and visualization of remotely operated robots in the real world are varied, and the associated task performance is observed. Experimental results from 12 participants, who performed a total of 95 VR-guided puzzle assembly tasks, demonstrated that a combination of low resolution and uncolored points has the most significant effect on participants’ performance. Participants mentioned that they needed to rely on tactile feedback when the perceptual feedback was minimal. The least significant parameter was the resolution of the data representations, which, when varied within the experimental bounds, only resulted in a 5% average change in completion time. Participants also indicated in surveys that they felt their performance improved and frustration was reduced when provided with color information about the scene.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"75 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126980653","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
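The abstract varies the resolution and coloring of point-cloud representations as experimental conditions. A minimal sketch of producing such variants from an existing cloud is shown below, using uniform random downsampling and color stripping; the paper generates its conditions from live RGB-D streams, so this function and its parameters are illustrative assumptions.

```python
import random

def vary_point_cloud(points, colors, keep_ratio=0.25, keep_color=True, seed=0):
    """Produce a lower-resolution and/or uncolored variant of a point cloud.

    Illustrative only: `points` is a list of (x, y, z) tuples and `colors`
    a list of matching (r, g, b) tuples.
    """
    rng = random.Random(seed)
    idx = [i for i in range(len(points)) if rng.random() < keep_ratio]
    sub_points = [points[i] for i in idx]
    sub_colors = ([colors[i] for i in idx] if keep_color
                  else [(128, 128, 128)] * len(idx))   # flat gray when uncolored
    return sub_points, sub_colors

pts = [(x * 0.01, 0.0, 1.0) for x in range(1000)]
cols = [(255, 0, 0)] * 1000
low_res_uncolored = vary_point_cloud(pts, cols, keep_ratio=0.1, keep_color=False)
print(len(low_res_uncolored[0]))
```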
{"title":"TangibleData: Interactive Data Visualization with Mid-Air Haptics","authors":"Ayush Bhardwaj, Junghoon Chae, Richard Huynh Noeske, Jin Ryong Kim","doi":"10.1145/3489849.3489890","DOIUrl":"https://doi.org/10.1145/3489849.3489890","url":null,"abstract":"In this paper, we investigate the effects of mid-air haptics in interactive 3D data visualization. We build an interactive 3D data visualization tool that uses hand gestures and ultrasound-based mid-air haptic feedback to provide tangible interaction with 3D data visualizations in VR. We consider two types of 3D visualization datasets and provide different data-encoding methods for their haptic representations. Two user experiments are conducted to evaluate the effectiveness of our approach. The results of the first experiment show that adding a mid-air haptic modality can be beneficial regardless of noise conditions and is useful for handling occlusion or discerning density and volume information. The results of the second experiment further show the strengths and weaknesses of the direct-touch and indirect-touch modes. Our findings can shed light on designing and implementing tangible interaction for 3D data visualization with mid-air haptic feedback.","PeriodicalId":345527,"journal":{"name":"Proceedings of the 27th ACM Symposium on Virtual Reality Software and Technology","volume":"55 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133744810","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
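The abstract mentions data-encoding methods for the haptic representations. As a hedged sketch of one such encoding (not necessarily any of the paper's), the code below maps a local data value, such as point density under the fingertip, to a normalized mid-air haptic intensity; driving an actual ultrasound array would go through the vendor SDK, which is deliberately not shown, and all names are illustrative.

```python
def encode_density_to_haptics(value, v_min, v_max,
                              min_intensity=0.2, max_intensity=1.0):
    """Map a local data value (e.g., point density under the fingertip)
    to a normalized mid-air haptic intensity in [min_intensity, max_intensity].

    Illustrative only: the paper evaluates several encodings; this is a
    simple linear one with clamping.
    """
    if v_max <= v_min:
        return min_intensity
    t = (value - v_min) / (v_max - v_min)
    t = max(0.0, min(1.0, t))                      # clamp to the data range
    return min_intensity + t * (max_intensity - min_intensity)

# Density of 340 points in a cell, over a dataset ranging from 0 to 500
print(encode_density_to_haptics(340, 0, 500))      # -> 0.744
```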