Proceedings of the 2020 ACM Symposium on Spatial User Interaction: Latest Publications

Visuomotor Influence of Attached Robotic Neck Augmentation
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3418460
L. Shen, M. Y. Saraiji, K. Kunze, K. Minamizawa, R. Peiris
Abstract: The combination of eye and head movements plays a major part in our visual process. The neck provides mobility for head motion and also limits the range of visual motion in space. In this paper, a robotic neck augmentation system is designed to surmount the physical limitations of the neck. In essence, it applies a visuomotor modification to the vision-neck relationship. We conducted two experiments to measure and verify the performance of the neck augmentation. The test results indicate that the system has a positive effect in augmenting visual motion. Specifically, the robotic neck can enlarge the range of neck motion to 200% and improve response motion, with response times 22% shorter and speeds 28% faster overall.
Citations: 6
Arc-Type and Tilt-Type: Pen-based Immersive Text Input for Room-Scale VR
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3418454
Bret Jackson, Logan B. Caraco, Zahara M Spilka
Abstract: As immersive, room-scale virtual and augmented reality become more utilized in productive workflows, the need for fast, equally immersive text input grows. Traditional keyboard interaction in these room-scale environments is limiting because of its predominantly seated usage and because the visual indicators needed for the hands and keys can break immersion. Pen-based VR/AR interaction presents an alternative immersive text input modality with high throughput. In this paper, we present two novel interfaces designed for a pen-shaped stylus that do not require positioning the controller within a region of space, but instead detect input from rotation and on-board buttons. Initial results show that, compared with controller-pointing text entry techniques, Tilt-Type was slower but produced fewer errors and was less physically demanding. Additional studies are needed to validate the results.
Citations: 5
Spatial Scale Perception for Design Tasks in Virtual Reality
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3422697
Jingjing Zhang, Ze Dong, R. Lindeman, Thammathip Piumsomboon
Abstract: We present a user study exploring spatial scale perception for design tasks by simulating different levels of eye height (EH) and inter-pupillary distance (IPD) in a virtual environment. The study examined two levels of spatial scale perception (that of two-year-old children and that of adults) in a chair scale estimation task. This was to provide appropriate perspectives that enable a suitable estimation of virtual object scale for the target group during the design process. We found that the disparity between the perspective taken and the target user group had a significant impact on the resulting scale of the chairs. Our key contribution is providing evidence that experiencing different spatial scale perceptions in VR has the potential to improve the designer's understanding of the end-user's perspective during the design process.
Citations: 3
Exploring the Use of Olfactory Stimuli Towards Reducing Visually Induced Motion Sickness in Virtual Reality
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3418451
Nimesha Ranasinghe, Pravar Jain, David Tolley, Shienny Karwita Tailan, Ching-Chiuan Yen, E. Do
Abstract: Visually Induced Motion Sickness (VIMS) plagues a significant number of individuals who use Virtual Reality (VR) systems. Although several solutions have been proposed that aim to reduce the onset of VIMS, a reliable approach for moderating it within VR experiences has not yet been established. Here, we set the initial stage for exploring the use of controlled olfactory stimuli to reduce symptoms associated with VIMS. In this experimental study, participants perceived different olfactory stimuli while experiencing a first-person-view rollercoaster simulation using a VR Head-Mounted Display (HMD). The onset of VIMS symptoms was analyzed using both the Simulator Sickness Questionnaire (SSQ) and the Fast Motion Sickness Scale (FMS). Notable reductions in overall SSQ and FMS scores suggest that providing a peppermint aroma reduces the severity of VIMS symptoms experienced in VR. Additional anecdotal feedback and potential future studies on using controlled olfactory stimuli to minimize the occurrence of VIMS symptoms are also discussed.
Citations: 7
TanGo: Exploring Expressive Tangible Interactions on Head-Mounted Displays
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3418457
Ruei-Che Chang, C. Chiang, Shuo-wen Hsu, Chih-Yun Yang, Da-Yuan Huang, Bing-Yu Chen
Abstract: We present TanGo, an always-available input modality on a VR headset that can complement current VR accessories. TanGo is an active mechanical structure mounted symmetrically on a Head-Mounted Display, enabling three-dimensional bimanual sliding input; each degree of freedom is furnished with a brake system driven by a micro servo, generating a total of six passive resistive force profiles. TanGo is an all-in-one structure that offers rich input and output while remaining compact, balancing trade-offs among size, weight, and usability. Users can actively perform gestures such as pushing, shearing, or squeezing with specific output provided, while resting their hands during stationary experiences. TanGo also gives users the flexibility to switch seamlessly between virtual and real usage in Augmented Reality without additional effort or instruments. We demonstrate three applications to show TanGo's interaction space, then discuss its limitations and present future possibilities based on preliminary user feedback.
Citations: 1
Augmented Reality could transform last-mile logistics
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3422702
J. Winkel, D. Datcu, P. Buijs
Abstract: The rapid growth of e-commerce is challenging parcel delivery companies to meet demand while keeping their costs low. While drivers have to meet unrealistic targets, the companies deal with a high staff turnover rate in the last mile; consequently, with every turnover, critical last-mile experience is lost. In this paper, we introduce an augmented-reality-based system that keeps track of events during the parcel handling process for last-mile logistics. The system can register and track parcels inside the truck and project their locations to the driver in augmented form. The system, which we have integrated into a mobile application, potentially reduces processing times and accelerates the training of new drivers. We discuss first-hand observations and propose further research areas. Field tests are currently under development to verify whether the system can operate in a real-life environment.
Citations: 3
Hand with Sensing Sphere: Body-Centered Spatial Interactions with a Hand-Worn Spherical Camera
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3418450
Riku Arakawa, Azumi Maekawa, Zendai Kashino, M. Inami
Abstract: We propose a novel body-centered interaction system making use of a spherical camera attached to a hand. Its broad and unique field of view enables an all-in-one approach to sensing multiple pieces of contextual information in hand-based spatial interactions: (i) hand location on the body surface, (ii) hand posture, (iii) hand keypoints in certain postures, and (iv) the near-hand environment. The proposed system uses a deep-learning approach to perform hand location and posture recognition, achieving high accuracy (85.0% and 88.9%, respectively) after collecting sufficient data and training. Our results and example demonstrations show the potential of utilizing 360° cameras for vision-based sensing in context-aware, body-centered spatial interactions.
Citations: 8
Augmented Reality for Self-Organized Logistics in Distribution
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3421721
D. Datcu, J. Winkel
Abstract: We have developed an augmented-reality (AR) based system that keeps track of events during the parcel handling process for last-mile logistics. The system can retrieve and highlight, in AR on the user's smartphone, the locations of parcels in large piles. A camera array automatically detects parcels placed manually on the shelf by an operator. New parcels are scanned, and parcel fingerprints are generated semi-automatically. The system can detect and track known parcels by fingerprint and can further highlight a parcel's location using 3D visual cues, directly on the operator's smartphone.
Citations: 0
Surface vs Motion Gestures for Mobile Augmented Reality
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3422694
Ze Dong, Jingjing Zhang, R. Lindeman, Thammathip Piumsomboon
Abstract: Surface gestures have been the dominant form of interaction for mobile devices with touchscreens. From our survey of current consumer mobile Augmented Reality (AR) applications, we found that these applications have also adopted the surface gesture metaphor as their interaction technique. However, we believe that designers should be able to utilize the affordance of the three-dimensional interaction space that AR provides. We compared two interaction techniques, surface and motion gestures, in a Pokémon GO-like mobile AR game that we developed. Ten participants experienced both techniques to capture three sizes of Pokémon. The comparison examined the two types of gestures in terms of accuracy, game experience, and subjective ratings of goodness, ease of use, and engagement. We found that motion gestures provided better engagement and game experience, while surface gestures were more accurate and easier to use.
Citations: 3
A framework for scalable tracking of physical objects to enhance immersive prototyping for design thinking
Proceedings of the 2020 ACM Symposium on Spatial User Interaction Pub Date: 2020-10-30 DOI: 10.1145/3385959.3422700
Pak Ming Fan, T. Pong
Abstract: Design thinking is regarded as one of the important problem-solving skills and includes a prototyping stage for testing the feasibility of an idea early. With advances in technology, virtual reality is being applied to create immersive experiences in design prototyping. While tracking physical objects and visualizing the tracking in the virtual world could enhance these immersive experiences, existing tracking devices are not easy to set up and might not scale cost-effectively in multi-user scenarios. The aim of this study is therefore to propose a framework for using simple, low-cost hardware to track physical objects in a multi-user environment. A sample implementation along with demonstrations of the tracking is also presented.
Citations: 1