{"title":"Biometric Authentication Using the Motion of a Hand","authors":"Satoru Imura, H. Hosobe","doi":"10.1145/2983310.2989210","DOIUrl":"https://doi.org/10.1145/2983310.2989210","url":null,"abstract":"We propose a hand gesture-based spatial interaction method for biometric authentication. It supports 3D gestures that allow the user to move his/her hand without touching an input device. Using the motions of fingertips and joints as biometric data, the method improves the accuracy of authentication. We present the results of experiments, where subjects performed three types of gestures.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"42 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134433393","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Large Scale Interactive AR Display Based on a Projector-Camera System","authors":"Chun Xie, Y. Kameda, Kenji Suzuki, I. Kitahara","doi":"10.1145/2983310.2989183","DOIUrl":"https://doi.org/10.1145/2983310.2989183","url":null,"abstract":"School gymnasium, which has an important role in either physical or mental development of children, is a necessary facility for most schools. In recent years, considering the individual differences among students in terms of gender, age, developmental level or interest, many new forms of gymnasium activity have been developed to make physical education more flexible. In some cases, introducing new physical activity is accompanied by a requirement of drawing new contents on the floor of a gymnasium. Ordinary, this is done by using line-tape. However, contents created by line-tape need periodic maintenance that is costly and time-consuming. Moreover, overlapping lines for different purposes can make users confused. Furthermore, the most critical problem is that line-tape can represent only simple and static contents, thus, the variety of new physical education activity are greatly limited. This paper proposes a projection-based AR system consisting of multiple projectors and cameras to deal with the problems described above. This system is aiming to provide extension functions to traditional school gymnasium by realizing not only representation of dynamic AR contents but also interactive display on the gymnasium floor.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"162 9 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129201766","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Session details: Interaction I","authors":"Barrett Ens","doi":"10.1145/3248572","DOIUrl":"https://doi.org/10.1145/3248572","url":null,"abstract":"","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"38 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126660087","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Sharpen Your Carving Skills in Mixed Reality Space","authors":"Maho Kawagoe, M. Otsuki, F. Shibata, Asako Kimura","doi":"10.1145/2983310.2989188","DOIUrl":"https://doi.org/10.1145/2983310.2989188","url":null,"abstract":"This paper proposes a virtual carving system using ToolDevice in a mixed reality (MR) space. By touching and moving the device over real objects, users can carve it virtually. Real-world wood carving with wood carving tools requires several steps such as carving a rough outline, shaping the wood, and carving patterns on its surface. In this paper, we focus on the step of carving patterns on a surface and implement it in our MR carving system.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122599307","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"KnowWhat: Mid Field Sensemaking for the Visually Impaired","authors":"Sujeath Pareddy, A. Agarwal, Manohar Swaminathan","doi":"10.1145/2983310.2989190","DOIUrl":"https://doi.org/10.1145/2983310.2989190","url":null,"abstract":"KnowWhat is our solution to help speed up mid-field sensemaking by visually impaired persons (VIPs). Our prototype combines a spectacle mounted camera, passive fiducial marker based tagging of the environment and 3D spatial audio to build a novel interaction technique. We present qualitative results of experiments to evaluate our solution.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122752368","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"3D Camera Pose History Visualization","authors":"Mayra Donaji Barrera Machuca, W. Stuerzlinger","doi":"10.1145/2983310.2989185","DOIUrl":"https://doi.org/10.1145/2983310.2989185","url":null,"abstract":"We present a 3D camera pose history visualization that can assist users of CAD software's, virtual worlds and scientific visualizations to revisit their navigation history. The contribution of this system is to enable users to move more efficiently through the virtual environment so they can focus on their main activity.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131917299","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optimising Free Hand Selection in Large Displays by Adapting to User's Physical Movements","authors":"Xiaolong Lou, A. X. Li, Ren Peng, Preben Hansen","doi":"10.1145/2983310.2985754","DOIUrl":"https://doi.org/10.1145/2983310.2985754","url":null,"abstract":"Advance in motion sensing technologies such as Microsoft Kinect and ASUS Xtion has enabled users to select targets on a large display through natural hand gestures. In such interaction, the users move left and right to navigate the display, and they frequently adjust body proximity against the display thus to switch between overall views and focus views. These physical movements benefit information navigation, interaction modality switch, and user interface adaptation. But in more specific context of free hand selection in large displays, the effect of physical movements is less systematically investigated. To explore the potential of physical movements in free hand selection, a physical movements-adapted technique is developed and evaluated. The results show that the new technique has significant improvements in both selection efficiency and accuracy, the more difficult selection task the more obvious improvement in accuracy. Additionally, the new technique is preferred to the baseline of pointer acceleration (PA) technique by participants.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"73 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134197187","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The Reality of Mixed Reality","authors":"S. Izadi","doi":"10.1145/2983310.2983311","DOIUrl":"https://doi.org/10.1145/2983310.2983311","url":null,"abstract":"Since Ivan Sutherland's Sword of Damocles, researchers have been pushing to make augmented, virtual and mixed reality, a reality. In recent years, these technologies have exploded onto the grand stage, with many devices on the consumer market, with no apparent slowing down in terms of demand. However, whilst excitement and thirst for mixed reality technologies is at a high, there are still many challenges in making such technologies a reality for everyday consumers. In this talk, I will outline some of these challenges -- some technical, some experiential, almost all social -- and discuss how one of the key factors of taking mixed reality to the next level is around enhancing the way humans can ultimately interact and communicate. As part of this I will outline why real-time 3D capture, reconstruction and understanding of humans and the world around us is the key technology enabler in making this form of mixed reality truly ubiquitous.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"281 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134410743","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improving Interaction in HMD-Based Vehicle Simulators through Real Time Object Reconstruction","authors":"Michael Bottone, K. Johnsen","doi":"10.1145/2983310.2985761","DOIUrl":"https://doi.org/10.1145/2983310.2985761","url":null,"abstract":"Bringing real objects into the virtual world has been shown to increase usability and presence in virtual reality applications. This paper presents a system to generate a real time virtual reconstruction of real world user interface elements for use in a head mounted display based driving simulator. Our system uses sensor fusion algorithms to combine data from depth and color cameras to generate an accurate, detailed, and fast rendering of the user's hands while using the simulator. We tested our system and show in our results that the inclusion of the participants real hands, the wheel, and the shifter in the virtual environment increases the immersion, presence, and usability of the simulation. Our system can also be used to bring other real objects into the virtual world, especially when accuracy, detail, and real time updates are desired.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133448914","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On Your Feet!: Enhancing Vection in Leaning-Based Interfaces through Multisensory Stimuli","authors":"E. Kruijff, Alexander Marquardt, Christina Trepkowski, R. Lindeman, André Hinkenjann, Jens Maiero, B. Riecke","doi":"10.1145/2983310.2985759","DOIUrl":"https://doi.org/10.1145/2983310.2985759","url":null,"abstract":"When navigating larger virtual environments and computer games, natural walking is often unfeasible. Here, we investigate how alternatives such as joystick- or leaning-based locomotion interfaces (\"human joystick\") can be enhanced by adding walking-related cues following a sensory substitution approach. Using a custom-designed foot haptics system and evaluating it in a multi-part study, we show that adding walking related auditory cues (footstep sounds), visual cues (simulating bobbing head-motions from walking), and vibrotactile cues (via vibrotactile transducers and bass-shakers under participants' feet) could all enhance participants' sensation of self-motion (vection) and involement/presence. These benefits occurred similarly for seated joystick and standing leaning locomotion. Footstep sounds and vibrotactile cues also enhanced participants' self-reported ability to judge self-motion velocities and distances traveled. Compared to seated joystick control, standing leaning enhanced self-motion sensations. Combining standing leaning with a minimal walking-in-place procedure showed no benefits and reduced usability, though. Together, results highlight the potential of incorporating walking-related auditory, visual, and vibrotactile cues for improving user experience and self-motion perception in applications such as virtual reality, gaming, and tele-presence.","PeriodicalId":185819,"journal":{"name":"Proceedings of the 2016 Symposium on Spatial User Interaction","volume":"152 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2016-10-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121795936","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}