Proceedings of the 2016 Symposium on Spatial User Interaction: Latest Publications

Session details: Panel
Aitor Rovira
Pub Date: 2016-10-15 | DOI: 10.1145/3248577
Citations: 0

KnowHow: Contextual Audio-Assistance for the Visually Impaired in Performing Everyday Tasks
A. Agarwal, Sujeath Pareddy, Swaminathan Manohar
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2989196
Abstract: We present a device for visually impaired persons (VIPs) that delivers contextual audio assistance for physical objects and tasks. In initial observations, we found ubiquitous use of audio-assistance technologies by VIPs for interacting with computing devices, such as Android TalkBack. However, we also saw that devices without screens frequently lack accessibility features. Our solution allows a VIP to obtain audio assistance for an arbitrary physical interface or object through a chest-mounted device. On-board camera sensors point toward the user's front-facing grasping region. Upon detecting gestures such as picking up an object, the device provides helpful contextual audio information. Textual interfaces can be read aloud by sliding a finger over the surface of the object, allowing the user to hear a document or receive audio guidance for electronic devices that lack assistive features. The user may ask questions verbally to refine the audio assistance, or to ask broad questions about their environment. Our motivation is to provide sensemaking faculties that creatively approximate those of non-VIPs in tasks that would otherwise make VIPs ineligible for common employment opportunities.
Citations: 1

Real-time Sign Language Recognition with Guided Deep Convolutional Neural Networks
Zhengzhe Liu, Fuyang Huang, G. Tang, F. Sze, J. Qin, Xiaogang Wang, Qiang Xu
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2989187
Abstract: We develop a real-time, robust, and accurate sign language recognition system leveraging deep convolutional neural networks (DCNNs). Our framework prevents common problems of existing frameworks, such as error accumulation, and outperforms state-of-the-art frameworks in terms of accuracy, recognition time, and usability.
Citations: 10

Desktop Orbital Camera Motions Using Rotational Head Movements
Thibaut Jacob, G. Bailly, É. Lecolinet, Géry Casiez, M. Teyssier
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2985758
Abstract: In this paper, we investigate how head movements can serve to change the viewpoint in 3D applications, especially when the viewpoint needs to be changed quickly and temporarily to disambiguate the view. We study how to use yaw and roll head movements to perform orbital camera control, i.e., to rotate the camera around a specific point in the scene. We report on four user studies. Study 1 evaluates the useful resolution of head movements. Study 2 informs about visual and physical comfort. Study 3 compares two interaction techniques, designed by taking into account the results of the two previous studies. Results show that head roll is more efficient than head yaw for orbital camera control when interacting with a screen. Finally, Study 4 compares head roll with a standard technique relying on the mouse and the keyboard. Moreover, users were allowed to use both techniques at their convenience in a second stage. Results show that users prefer and are faster (14.5%) with the head control technique.
Citations: 9

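The orbital camera control described in this abstract, i.e. rotating the camera around a pivot point driven by a head angle, can be sketched as follows. This is a minimal illustration only: the function name, the `gain` parameter, and the axis conventions are assumptions, not the authors' transfer function.

```python
import math

def orbit_camera(cam_pos, pivot, head_roll_rad, gain=1.0):
    """Rotate a camera position around a pivot in the horizontal (XZ)
    plane, driven by a head-roll angle. Hypothetical mapping: the
    paper's actual gain and filtering are not reproduced here."""
    angle = gain * head_roll_rad
    dx = cam_pos[0] - pivot[0]
    dz = cam_pos[2] - pivot[2]
    cos_a, sin_a = math.cos(angle), math.sin(angle)
    return (pivot[0] + dx * cos_a - dz * sin_a,
            cam_pos[1],                       # keep camera height fixed
            pivot[2] + dx * sin_a + dz * cos_a)
```

A 90° head roll with unit gain moves a camera at (1, 0, 0) a quarter-turn around the origin while preserving its distance to the pivot.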
Inducing Body-Transfer Illusions in VR by Providing Brief Phases of Visual-Tactile Stimulation
Oscar Ariza, J. Freiwald, Nadine Laage, M. Feist, Mariam Salloum, G. Bruder, Frank Steinicke
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2985760
Abstract: Current developments in the area of virtual reality (VR) allow numerous users to experience immersive virtual environments (VEs) in a broad range of application fields. At the same time, research has shown novel advances in wearable devices that provide vibrotactile feedback, which can be combined with low-cost technology for hand tracking and gesture recognition. The combination of these technologies can be used to investigate interesting psychological illusions. For instance, body-transfer illusions, such as the rubber-hand illusion or the elongated-arm illusion, have shown that it is possible to give a person the persistent illusion of body transfer after only brief phases of synchronized visual-haptic stimulation. The motivation of this paper is to induce such perceptual illusions by combining VR, vibrotactile, and tracking technologies, offering an interesting way to create new spatial interaction experiences centered on the senses of sight and touch. We present a technology framework that includes a pair of self-made gloves featuring vibrotactile feedback that can be synchronized with audio-visual stimulation in order to reproduce body-transfer illusions in VR. We describe the implementation of the framework in detail and show that the proposed setup is able to induce the elongated-arm illusion with automatic tactile stimuli, instead of the traditional approach based on manually synchronized stimulation.
Citations: 8

Touching the Sphere: Leveraging Joint-Centered Kinespheres for Spatial User Interaction
Paul Lubos, G. Bruder, Oscar Ariza, Frank Steinicke
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2985753
Abstract: Designing spatial user interfaces for virtual reality (VR) applications that are intuitive, comfortable, and easy to use while providing high task performance is challenging. The challenge is even harder because perception and action in immersive virtual environments differ significantly from the real world, causing natural user interfaces to elicit a dissociation of perceptual and motor space as well as levels of discomfort and fatigue unknown in the real world. In this paper, we present and evaluate a novel method that leverages joint-centered kinespheres for interactive spatial applications. We introduce kinespheres within arm's reach that envelope the reachable space for each joint, such as the shoulder, elbow, or wrist, thus defining 3D interactive volumes whose boundaries are 2D manifolds. We present a Fitts' Law experiment in which we evaluated spatial touch performance on the inside and on the boundary of the main joint-centered kinespheres. Moreover, we present a confirmatory experiment in which we compared joint-centered interaction with traditional spatial head-centered menus. Finally, we discuss the advantages and limitations of placing interactive graphical elements relative to joint positions and, in particular, on the boundaries of kinespheres.
Citations: 36

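A kinesphere, as used in this abstract, is the sphere of reachable positions around a joint; placing an element "on the boundary" amounts to projecting a point onto that sphere. A minimal sketch of that projection follows (the function name and the idea of a fixed per-joint radius are assumptions; per-user calibration is outside this sketch):

```python
import math

def clamp_to_kinesphere(point, joint, radius):
    """Project a 3D point onto (or keep it inside) a joint-centered
    kinesphere: the sphere of radius `radius` around a joint such as
    the shoulder, elbow, or wrist."""
    d = [p - j for p, j in zip(point, joint)]
    dist = math.sqrt(sum(c * c for c in d))
    if dist <= radius or dist == 0.0:
        return tuple(point)          # already reachable (or at the joint)
    scale = radius / dist            # pull the point back onto the boundary
    return tuple(j + c * scale for j, c in zip(joint, d))
```

For example, a target 2 m from a shoulder with a 1 m kinesphere radius is clamped to the boundary at 1 m along the same direction.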
Haptic Exploration of Remote Environments with Gesture-based Collaborative Guidance
Seokyeol Kim, Jinah Park
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2989201
Abstract: We present a collaborative haptic interaction method for exploring a remote physical environment with guidance from a distant helper. Spatial information about the remote environment, represented as a point cloud, is rendered directly as a contact force without surface reconstruction. On top of this, the helper can use free-hand gestures to selectively apply an attractive force, guiding the user toward a target, or a repulsive force, steering the user away from a forbidden region.
Citations: 2

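The helper-applied guidance described in this abstract, an attractive force toward a target or a repulsive force away from a forbidden region, can be sketched as a simple direction-scaled force. This is illustrative only: the function name, the constant-magnitude force model, and the `mode` flag are assumptions, not the paper's rendering algorithm.

```python
import math

def guidance_force(probe, target, mode, k=1.0):
    """Return a force vector of magnitude k pulling the haptic probe
    toward `target` ("attract") or pushing it away ("repel")."""
    dx = [t - p for p, t in zip(probe, target)]
    dist = math.sqrt(sum(d * d for d in dx)) or 1e-9  # avoid divide-by-zero
    direction = [d / dist for d in dx]
    sign = 1.0 if mode == "attract" else -1.0
    return [sign * k * d for d in direction]
```

In a full system this guidance term would be summed with the point-cloud contact force before being sent to the haptic device.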
TickTockRay Demo: Smartwatch Raycasting for Mobile HMDs
D. Kharlamov, Krzysztof Pietroszek, Liudmila Tahai
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2989206
Abstract: We demonstrate TickTockRay, an implementation of a fixed-origin raycasting technique that uses a smartwatch as an input device. We show that smartwatch-based raycasting is a good alternative to a head-rotation-controlled cursor or a specialized input device. TickTockRay implements fixed-origin raycasting with the ray originating from a fixed point located roughly at the user's chest. The control-display (C/D) ratio of the TickTockRay technique is set to 1, with exact correspondence between the ray and the smartwatch's rotation. This C/D ratio enables the user to select targets in the entire virtual reality control space.
Citations: 5

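Fixed-origin raycasting with a C/D ratio of 1, as described in this abstract, means the ray starts at a fixed point and its direction copies the watch orientation one-to-one. A minimal sketch using quaternion rotation of a forward vector (the function name and the choice of (0, 0, -1) as the forward axis are assumptions, not the demo's code):

```python
def ray_from_watch(origin, watch_quaternion):
    """Fixed-origin raycasting: return (origin, direction), where the
    direction is the forward vector (0, 0, -1) rotated by the unit
    quaternion (w, x, y, z) reported by the smartwatch. C/D ratio = 1:
    the ray copies the watch rotation exactly."""
    w, x, y, z = watch_quaternion
    # Standard quaternion rotation of v = (0, 0, -1).
    fx = -(2 * (x * z + w * y))
    fy = -(2 * (y * z - w * x))
    fz = -(1 - 2 * (x * x + y * y))
    return origin, (fx, fy, fz)
```

With the identity quaternion the ray points straight ahead from the chest-anchored origin; a 90° yaw of the watch yaws the ray by exactly 90°.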
Moving Ahead with Peephole Pointing: Modelling Object Selection with Head-Worn Display Field of View Limitations
Barrett Ens, David Ahlström, Pourang Irani
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2985756
Abstract: Head-worn displays (HWDs) are now becoming widely available, which will allow researchers to explore sophisticated interface designs that support rich user productivity features. In a large virtual workspace, the limited available field of view (FoV) may cause objects to be located outside of the available viewing area, requiring users to first locate an item using head motion before making a selection. However, FoV varies widely across different devices, with an unknown impact on interface usability. We present a user study to test two-step selection models previously proposed for "peephole pointing" in large virtual workspaces on mobile devices. Using a CAVE environment to simulate the FoV restriction of stereoscopic HWDs, we compare two different input methods, direct pointing and raycasting, in a selection task with varying FoV width. We find a very strong fit in this context, comparable to the prediction accuracy in the original studies, and much more accurate than the traditional Fitts' law model. We detect an advantage of direct pointing over raycasting, particularly with small targets. Moreover, we find that this advantage of direct pointing diminishes with decreasing FoV.
Citations: 15

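The "two-step selection model" this abstract tests decomposes selection time into a locate phase (head motion bringing the target into the peephole) and a select phase (pointing at the now-visible target), each Fitts-like. The sketch below is a generic illustration of that decomposition; the coefficients and the exact functional form are placeholders, not the fitted model from the paper or the original peephole-pointing studies.

```python
import math

def peephole_time(A, S, W, a=0.2, b=0.15, c=0.2):
    """Generic two-step peephole selection time for a target at
    amplitude A with width W, viewed through a peephole (FoV) of
    size S. Coefficients a, b, c are illustrative, not fitted."""
    locate = b * math.log2(A / S + 1)  # head motion to find the target
    select = c * math.log2(A / W + 1)  # pointing at the visible target
    return a + locate + select
```

The model captures the qualitative effect the paper studies: shrinking the FoV (smaller S) raises predicted selection time even when the target itself is unchanged.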
Arm-Hidden Private Area on an Interactive Tabletop System
Kai Li, Asako Kimura, F. Shibata
Pub Date: 2016-10-15 | DOI: 10.1145/2983310.2989194
Abstract: Tabletop systems are used primarily in meetings or other activities wherein information is shared. However, when confidential input is needed, for example when entering a password, privacy becomes an issue. In this study, when the user places a forearm on the tabletop, our security system uses the area hidden behind the forearm to display a confidential information window. We also introduce several potential applications of this hidden-area technique.
Citations: 0