Proceedings of the 2016 Symposium on Spatial User Interaction: Latest Publications

Improving Gestural Interaction With Augmented Cursors
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2985765
Authors: A. Dover, G. M. Poor, Darren Guinness, Alvin Jude
Abstract: Gesture-based interaction has become more affordable and ubiquitous as an interaction style in recent years. One issue with gestural pointing is the lack of accuracy with smaller targets. In this paper we propose that the use of augmented cursors, which has been shown to improve small target acquisition with a standard mouse, also improves small target acquisition for gestural pointing. In our study we explored the Bubble Lens and the Bubble Cursor as means to improve acquisition of smaller targets, and compared them with unaugmented interaction. Both methods significantly improved target selection. As part of our study, we also identified the parameters for configuring the Bubble Cursor for optimal results.
Citations: 6
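The Bubble Cursor referenced in this abstract is a published area-cursor technique in which the cursor's activation region grows dynamically so that the nearest target is always captured. As an illustration only, not the authors' implementation or their tuned parameters, a minimal Python sketch of that selection rule with hypothetical circular targets could be:

```python
import math

def bubble_cursor_target(cursor, targets):
    """Pick the target whose edge is closest to the cursor.

    `cursor` is an (x, y) tuple; each target is a dict with 'x', 'y'
    and 'r' (radius) keys -- hypothetical structures chosen for this
    sketch.  The bubble cursor's activation area implicitly expands
    until it reaches this nearest target.
    """
    def edge_distance(t):
        center_dist = math.hypot(t["x"] - cursor[0], t["y"] - cursor[1])
        return max(center_dist - t["r"], 0.0)

    return min(targets, key=edge_distance) if targets else None

# The small nearby target wins over the large distant one.
targets = [{"x": 40, "y": 35, "r": 5}, {"x": 200, "y": 180, "r": 30}]
print(bubble_cursor_target((50, 50), targets))
```

Because the activation area is defined by the nearest target rather than a fixed radius, every screen position selects some target, which is what makes this style of augmentation attractive for noisy gestural pointing.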
Coexistent Space: Collaborative Interaction in Shared 3D Space
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2989181
Authors: Ji-Yong Lee, Joung-Huem Kwon, S. Nam, Joong-Jae Lee, Bum-Jae You
Abstract: People have long desired to coexist in the same space with people in remote locations and to interact with them in a natural way. Various concepts have been proposed for connecting the real world with a remote world in 3D space [1, 2]. In this paper, the concept of Coexistent Space for collaborative interaction in shared 3D virtual space is proposed. Coexistent Space provides not only a feeling of coexistence, the sense of being with others, but also a shared virtual workspace for collaboration. Its main feature is that users can share a virtual whiteboard for writing and can manipulate shared virtual objects through bare-hand touch interaction.
Citations: 2
Using Area Learning in Spatially-Aware Ubiquitous Environments
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2989198
Authors: Edwin Chan, Yuxi Wang, T. Seyed, F. Maurer
Abstract: We propose a framework using Google's Area Learning to create ubiquitous environments with cross-device proxemic interactions. We apply the framework to the domain of Emergency Response, and discuss its benefits and initial feedback.
Citations: 0
Stickie: Mobile Device Supported Spatial Collaborations
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2989192
Authors: Jaskirat S. Randhawa
Abstract: Stickie is an application that enables people to perform sticky note collaborations remotely over the Internet by utilizing two highly ubiquitous workplace devices: smartphones and TV screens. The phone acts as an interface between the tangible and digital space. Users can leave notes from their phones on a TV screen by placing the phone itself on any ordinary non-touchscreen display. This research presents a novel technique for sticky note brainstorming across the Internet. It extends the interaction spatially by incorporating cues of participants' physical expressions and activities.
Citations: 2
Empirical Method for Detecting Pointing Gestures in Recorded Lectures
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2989193
Authors: Xiaojie Zha, M. Bourguet
Abstract: Pointing gestures performed by lecturers are important because they seem to indicate pedagogical significance. In this extended abstract, we describe a simple empirical method for detecting pointing gestures in recorded lectures captured using a depth sensor (Kinect). We first analyse component gestures; second, we assign them weights; finally, we set a threshold on the aggregated weight in order to decide whether a pointing gesture has been performed.
Citations: 0
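The three-step recipe in this abstract (score component gestures, weight them, threshold the aggregate) is essentially a weighted-evidence classifier. A minimal sketch, assuming invented cue names, weights and a threshold that are not taken from the paper, might look like:

```python
# Hypothetical component cues and weights; the paper's actual features,
# weights and threshold are not reproduced here.
COMPONENT_WEIGHTS = {
    "arm_extended": 0.5,   # elbow angle close to fully straight
    "hand_raised": 0.3,    # hand lifted toward the slides
    "hand_still": 0.2,     # low hand velocity over a few frames
}
THRESHOLD = 0.7

def is_pointing(frame_cues):
    """frame_cues maps each cue name to True/False for one Kinect frame,
    e.g. derived from skeleton joint positions."""
    score = sum(weight for cue, weight in COMPONENT_WEIGHTS.items()
                if frame_cues.get(cue, False))
    return score >= THRESHOLD

print(is_pointing({"arm_extended": True, "hand_raised": True}))  # True  (0.8 >= 0.7)
print(is_pointing({"hand_raised": True, "hand_still": True}))    # False (0.5 <  0.7)
```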
Mushi: A Generative Art Canvas for Kinect Based Tracking
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2989179
Authors: Jennifer Weiler, Sudarshan Seshasayee
Abstract: Using modern technology in the form of body tracking and real-time image processing software, we propose to contextualize abstract art with an experiential system. Overall, our goal was to capture something as ethereal and transitory as movement and translate it into a painterly medium.
Citations: 0
Locomotion in Virtual Reality for Individuals with Autism Spectrum Disorder
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2985763
Authors: Evren Bozgeyikli, A. Raij, S. Katkoori, R. Dubey
Abstract: Virtual reality (VR) has been used as an effective tool for training individuals with autism spectrum disorder (ASD), and recently there has been an increase in the number of applications developed for this purpose. One of the most important aspects of these applications is locomotion, an essential form of human-computer interaction. Locomotion in VR has a direct effect on many aspects of user experience, such as enjoyment, frustration, tiredness, motion sickness and presence. Many locomotion techniques have been proposed for VR, but most were designed and evaluated for neurotypical users; to our knowledge, no study has focused on locomotion techniques and their evaluation for individuals with ASD. In this study, eight locomotion techniques were implemented in an immersive virtual reality test environment. They can be categorized as follows: three commonly used techniques (redirected walking, walk-in-place and joystick controller), two unexplored techniques (stepper machine and point & teleport) and three techniques selected and designed for individuals with ASD based on their common characteristics (flying, flapping and trackball controller). A user study was performed with 12 high-functioning individuals with ASD. Results indicated that the joystick and point & teleport techniques provided the most comfortable use for individuals with ASD, followed by walk-in-place and trackball. In contrast, flying and hand flapping did not provide comfortable use.
Citations: 54
SHIFT-Sliding and DEPTH-POP for 3D Positioning
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-10-15. DOI: 10.1145/2983310.2985748
Authors: Junwei Sun, W. Stuerzlinger, Dmitri Shuralyov
Abstract: Moving objects is an important task in 3D user interfaces. We describe two new techniques for 3D positioning, designed for a mouse but usable with other input devices. The techniques enable rapid yet easy-to-use positioning of objects in 3D scenes, including precise positioning of constrained objects. With sliding, the object follows the cursor and moves on the surfaces of the scene. Sliding assumes that by default objects stay in contact with the scene's front surfaces, are always at least partially visible, and do not interpenetrate other objects. With our new Shift-Sliding method the user can override these default assumptions and lift objects into the air or make them collide with other objects. Shift-Sliding uses the local coordinate system of the surface that the object was last in contact with, which is a new form of context-dependent manipulation. We also present Depth-Pop, which maps mouse wheel actions to all object positions along the mouse ray where the object meets the default assumptions for sliding. For efficiency, both methods use frame buffer techniques. Two user studies show that the new techniques significantly speed up common 3D positioning tasks.
Citations: 9
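Depth-Pop, as described above, maps mouse-wheel steps to the discrete positions along the mouse ray at which the sliding assumptions hold. A rough sketch of that control flow, with hypothetical `scene.intersections` and `scene.is_valid_placement` helpers standing in for the paper's frame-buffer based tests, could be:

```python
def candidate_positions(ray_origin, ray_dir, scene):
    """Collect object positions along the mouse ray, near to far, keeping only
    those that satisfy the sliding assumptions (visible, in contact, no
    interpenetration).  Both scene methods are hypothetical stand-ins."""
    hits = sorted(scene.intersections(ray_origin, ray_dir), key=lambda h: h.t)
    return [h.point for h in hits if scene.is_valid_placement(h.point)]

class DepthPop:
    def __init__(self, positions):
        self.positions = positions
        self.index = 0                  # start at the nearest valid depth

    def on_wheel(self, steps):
        """Positive steps push the object to the next valid position farther
        along the ray; negative steps pull it back toward the camera."""
        self.index = max(0, min(len(self.positions) - 1, self.index + steps))
        return self.positions[self.index]
```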
Preference Between Allocentric and Egocentric 3D Manipulation in a Locally Coupled Configuration
Proceedings of the 2016 Symposium on Spatial User Interaction. Pub Date: 2016-09-28. DOI: 10.1145/2983310.2985750
Authors: Paul Issartel, Lonni Besançon, F. Guéniat, Tobias Isenberg, M. Ammi
Abstract: We study user preference between allocentric and egocentric 3D manipulation on mobile devices, in a configuration where the motion of the device is applied to an object displayed on the device itself. We first evaluate this preference for translations and rotations alone, then for full 6-DOF manipulation. We also investigate the role of contextual cues by performing this experiment in different 3D scenes. Finally, we look at the specific influence of each manipulation axis. Our results provide guidelines to help interface designers select an appropriate default mapping in this locally coupled configuration.
Citations: 9
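One common way to make the reference-frame distinction behind this study concrete is to ask whether an incremental rotation measured from the device is composed in fixed world axes or in the object's own axes. Whether this left/right composition corresponds exactly to the paper's allocentric and egocentric mappings is an assumption of this sketch, which is offered only as a generic illustration:

```python
import numpy as np

def rot_x(deg):
    a = np.radians(deg); c, s = np.cos(a), np.sin(a)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def rot_z(deg):
    a = np.radians(deg); c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def compose_world(orientation, increment):
    # Increment interpreted in fixed scene (world) axes.
    return increment @ orientation

def compose_local(orientation, increment):
    # Same increment interpreted in the object's own (local) axes.
    return orientation @ increment

q = rot_z(90)   # current object orientation
d = rot_x(10)   # incremental rotation measured from the handheld device
print(np.allclose(compose_world(q, d), compose_local(q, d)))  # False: the two mappings diverge
```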
Proceedings of the 2016 Symposium on Spatial User Interaction
DOI: 10.1145/2983310
Citations: 1