ACM SIGGRAPH 2021 Emerging Technologies: Latest Publications

Balanced Glass Design: A flavor perception changing system by controlling the center-of-gravity
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465344
Masaharu Hirose, M. Inami
{"title":"Balanced Glass Design: A flavor perception changing system by controlling the center-of-gravity","authors":"Masaharu Hirose, M. Inami","doi":"10.1145/3450550.3465344","DOIUrl":"https://doi.org/10.1145/3450550.3465344","url":null,"abstract":"In this paper, we propose Balanced Glass Design, a system to change flavor perception. The system consists of glass-type device shifting its center of gravity in response to the user’s motion which allows drinking a beverage with a virtual perception of weight through drinking motion. We thought It’s possible to intervene in the user’s perception of flavor by displaying virtual weight perception, and so conducted experiments on weight perception and demonstrations as a user study. This paper describes the system design, the result of experiments, and comments obtained through a user study.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"44 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114483952","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
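A minimal sketch of the underlying idea, not the authors' implementation: assuming a glass with a single movable internal mass (masses, geometry, and the tilt-to-CoG mapping below are all assumptions), one can solve for where the mass must sit so the combined center of gravity reaches a target offset along the glass axis.

```python
# Minimal sketch (assumed masses and geometry): position a movable internal mass
# so that the combined glass + mass center of gravity lands at a target offset.

def mass_position_for_target_cog(target_cog_mm: float,
                                 glass_mass_g: float = 200.0,
                                 glass_cog_mm: float = 60.0,
                                 movable_mass_g: float = 50.0,
                                 travel_mm: tuple = (0.0, 120.0)) -> float:
    """Return the movable-mass position (mm from the glass base) that places the
    combined center of gravity at target_cog_mm, clamped to the travel range."""
    # Combined CoG: (m_g * c_g + m_m * x) / (m_g + m_m) = target  ->  solve for x.
    total = glass_mass_g + movable_mass_g
    x = (target_cog_mm * total - glass_mass_g * glass_cog_mm) / movable_mass_g
    return max(travel_mm[0], min(travel_mm[1], x))

if __name__ == "__main__":
    # Raise the CoG as the user tilts the glass (hypothetical tilt-to-CoG mapping).
    for tilt_deg in (0, 20, 40):
        target = 60.0 + 0.5 * tilt_deg
        print(tilt_deg, "deg ->", round(mass_position_for_target_cog(target), 1), "mm")
```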
ACM SIGGRAPH 2021 Emerging Technologies
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550
{"title":"ACM SIGGRAPH 2021 Emerging Technologies","authors":"","doi":"10.1145/3450550","DOIUrl":"https://doi.org/10.1145/3450550","url":null,"abstract":"","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"87 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124366685","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
Reverse Pass-Through VR
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465338
N. Matsuda, B. Wheelwright, J. Hegland, Douglas Lanman
{"title":"Reverse Pass-Through VR","authors":"N. Matsuda, B. Wheelwright, J. Hegland, Douglas Lanman","doi":"10.1145/3450550.3465338","DOIUrl":"https://doi.org/10.1145/3450550.3465338","url":null,"abstract":"We introduce reverse pass-through VR, wherein a three-dimensional view of the wearer’s eyes is presented to multiple outside viewers in a perspective-correct manner, with a prototype headset containing a world-facing light field display. This approach, in conjunction with existing video (forward) pass-through technology, enables more seamless interactions between people with and without headsets in social or professional contexts. Reverse pass-through VR ties together research in social telepresence and copresence, autostereoscopic displays, and facial capture to enable natural eye contact and other important non-verbal cues in a wider range of interaction scenarios.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"69 2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123117745","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 8
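A minimal sketch under assumed display parameters (view counts and emission cone are not from the paper): one way a world-facing light-field display could pick which pre-rendered view of the wearer's eyes to emit toward an outside viewer, based on the viewer's direction relative to the headset.

```python
# Minimal sketch (hypothetical view grid and field of view): map an outside viewer's
# position in the headset frame to the nearest discrete light-field view index.
import math

def view_index_for_viewer(viewer_xyz, n_horizontal=15, n_vertical=9,
                          fov_h_deg=60.0, fov_v_deg=40.0):
    """Map a viewer position (meters, headset frame: +z forward, +x right, +y up)
    to a discrete (column, row) view index on the world-facing display."""
    x, y, z = viewer_xyz
    azimuth = math.degrees(math.atan2(x, z))                 # left/right angle
    elevation = math.degrees(math.atan2(y, math.hypot(x, z)))
    # Normalize angles into [0, 1] across the emission cone, then bin to a view.
    u = min(max(azimuth / fov_h_deg + 0.5, 0.0), 1.0)
    v = min(max(elevation / fov_v_deg + 0.5, 0.0), 1.0)
    return int(round(u * (n_horizontal - 1))), int(round(v * (n_vertical - 1)))

print(view_index_for_viewer((0.3, 0.0, 1.0)))   # viewer slightly to the wearer's right
```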
Demonstrating Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465340
Shan-Yuan Teng, Pengyu Li, Romain Nith, Joshua Fonseca, Pedro Lopes
{"title":"Demonstrating Touch&Fold: A Foldable Haptic Actuator for Rendering Touch in Mixed Reality","authors":"Shan-Yuan Teng, Pengyu Li, Romain Nith, Joshua Fonseca, Pedro Lopes","doi":"10.1145/3450550.3465340","DOIUrl":"https://doi.org/10.1145/3450550.3465340","url":null,"abstract":"We propose a nail-mounted foldable haptic device that provides tactile feedback to mixed reality (MR) environments by pressing against the user's fingerpad when a user touches a virtual object. What is novel in our device is that it quickly tucks away when the user interacts with real-world objects. Its design allows it to fold back on top of the user's nail when not in use, keeping the user's fingerpad free to, for instance, manipulate handheld tools and other objects while in MR. To achieve this, we engineered a wireless and self-contained haptic device, which measures 24×24×41 mm and weighs 9.5 g. Furthermore, our foldable end-effector also features a linear resonant actuator, allowing it to render not only touch contacts (i.e., pressure) but also textures (i.e., vibrations). We demonstrate how our device renders contacts with MR surfaces, buttons, low- and high-frequency textures.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"17 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128032104","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
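A minimal sketch, not the paper's firmware (resonance frequency, envelope, and amplitudes are assumptions): composing a drive signal for a linear resonant actuator so a single end-effector can render both a contact "press" (a short onset burst) and a texture (a sustained vibration).

```python
# Minimal sketch (assumed LRA parameters): synthesize a normalized drive waveform
# combining a decaying contact burst with a sustained texture vibration.
import numpy as np

def lra_waveform(contact: bool, texture_hz: float, duration_s: float = 0.2,
                 sample_rate: int = 8000, resonance_hz: float = 170.0) -> np.ndarray:
    t = np.arange(int(duration_s * sample_rate)) / sample_rate
    signal = np.zeros_like(t)
    if contact:
        # Short, decaying burst at the LRA resonance approximates a contact tap.
        signal += np.exp(-t / 0.03) * np.sin(2 * np.pi * resonance_hz * t)
    if texture_hz > 0:
        # Sustained low-amplitude vibration conveys surface texture while sliding.
        signal += 0.3 * np.sin(2 * np.pi * texture_hz * t)
    return np.clip(signal, -1.0, 1.0)   # normalized drive, to be scaled by the driver

wave = lra_waveform(contact=True, texture_hz=250.0)
print(wave.shape, float(wave.max()))
```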
MetamorHockey: A Projection-based Virtual Air Hockey Platform Featuring Transformable Mallet Shapes
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465341
Shun Ueda, S. Kagami, K. Hashimoto
{"title":"MetamorHockey: A Projection-based Virtual Air Hockey Platform Featuring Transformable Mallet Shapes","authors":"Shun Ueda, S. Kagami, K. Hashimoto","doi":"10.1145/3450550.3465341","DOIUrl":"https://doi.org/10.1145/3450550.3465341","url":null,"abstract":"We propose a novel projection-based virtual air hockey system in which not only the puck but also the mallet is displayed as an image. Being a projected image, the mallet can freely “metamorphose” into different shapes, which expands the game design beyond the original air hockey. We discuss possible scenarios with a resizable mallet, with mallet shapes defined by drawing, and with a mallet whose collision conditions can be modified. A key challenge in implementation is to minimize the latency because the direct manipulation nature of the mallet positioning imposes a higher demand on latency than the puck positioning. By using a high-speed camera and a high-speed projector running at 420 fps, a satisfactorily quick tracking became possible such that we feel a projected mallet head to be an integral part of a mallet held by hand.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"133 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132222896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
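A minimal sketch under assumptions (units, puck and mallet radii, and the circular mallet shape are hypothetical): one iteration of a 420 fps update loop that checks a tracked puck against a projected mallet and reflects the puck's velocity on contact, which is the kind of per-frame work the low-latency pipeline must finish within one frame.

```python
# Minimal sketch (assumed geometry, mm units): one frame of puck vs. projected
# circular mallet, with an elastic reflection of the puck velocity on contact.
import math

DT = 1.0 / 420.0   # frame period of the high-speed camera/projector pipeline

def step(puck_pos, puck_vel, mallet_pos, puck_r=15.0, mallet_r=40.0):
    px, py = puck_pos[0] + puck_vel[0] * DT, puck_pos[1] + puck_vel[1] * DT
    dx, dy = px - mallet_pos[0], py - mallet_pos[1]
    dist = math.hypot(dx, dy)
    if 0 < dist < puck_r + mallet_r:
        # Reflect the puck velocity about the contact normal.
        nx, ny = dx / dist, dy / dist
        dot = puck_vel[0] * nx + puck_vel[1] * ny
        puck_vel = (puck_vel[0] - 2 * dot * nx, puck_vel[1] - 2 * dot * ny)
    return (px, py), puck_vel

pos, vel = step((100.0, 100.0), (-500.0, 0.0), (60.0, 100.0))
print(pos, vel)
```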
Sustainable society with a touchless solution using UbiMouse under the pandemic of COVID-19
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3470532
Daisuke Akagawa, Junichi Takatsu, Ryoji Otsu, Seiichi Hayashi, Benjamin Vallet
{"title":"Sustainable society with a touchless solution using UbiMouse under the pandemic of COVID-19","authors":"Daisuke Akagawa, Junichi Takatsu, Ryoji Otsu, Seiichi Hayashi, Benjamin Vallet","doi":"10.1145/3450550.3470532","DOIUrl":"https://doi.org/10.1145/3450550.3470532","url":null,"abstract":"This paper introduces a new artificial intelligence software which is capable of controlling devices using fingers in the air. With Ubimouse, touch-panels, restaurant ordering systems, ATM systems, and etc., which are commonly used by various people in public, can be contact-less devices. These touch-less devices, especially under the harsh conditions of COVID-19, are desired to prevent infections mediated by touch devices. Also, these devices cannot be used while wearing gloves due to their touch sensing fault. Thus, in fields using gloves, there is a demand for non-contact device operation. To satisfy these demands, we have developed “UbiMouse”. This is an AI software that allows you to operate the device by moving your fingers toward the device. In the AIs in UbiMouse, a convolution model and a regression model are used to identify fingers’ features from camera footage and to estimates the position of a detected finger, respectively. We demonstrate an operation of UbiMouse without contact. Along with this operation in the air, a mouse cursor is guided to the specified location with high accuracy.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115353711","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
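A minimal sketch, not the product's actual model: a single small convolutional network with a regression head stands in for the separate convolution and regression models described above, mapping a camera frame to a normalized fingertip position that is then scaled to screen pixels to drive the cursor. Architecture, input size, and screen mapping are assumptions.

```python
# Minimal sketch (hypothetical architecture): camera frame -> CNN features ->
# regression head -> normalized (x, y) fingertip position -> cursor pixels.
import torch
import torch.nn as nn

class FingertipRegressor(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(            # convolutional feature extractor
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Sequential(nn.Flatten(), nn.Linear(64, 2), nn.Sigmoid())

    def forward(self, frame):                     # frame: (B, 3, H, W) in [0, 1]
        return self.head(self.features(frame))    # (B, 2) normalized (x, y)

def to_cursor(norm_xy, screen_w=1920, screen_h=1080):
    x, y = norm_xy
    return int(x * screen_w), int(y * screen_h)

model = FingertipRegressor().eval()
with torch.no_grad():
    xy = model(torch.rand(1, 3, 128, 128))[0].tolist()   # random frame as a stand-in
print(to_cursor(xy))
```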
Polarimetric Spatio-Temporal Light Transport Probing
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3469737
Seung-Hwan Baek, Felix Heide
{"title":"Polarimetric Spatio-Temporal Light Transport Probing","authors":"Seung-Hwan Baek, Felix Heide","doi":"10.1145/3450550.3469737","DOIUrl":"https://doi.org/10.1145/3450550.3469737","url":null,"abstract":"Fig. 1. We propose a computational light transport probingmethod that decomposes transport into full polarization, spatial and temporal dimensions.Wemodel this multi-dimensional light transport as a tensor and analyze low-rank structure in the polarization domain which is exploited by our polarimetric probing method. We instantiate our approach with two imaging systems for spatio-polarimetric and coaxial temporal-polarimetric capture. (a)&(d) Conventional intensity imagers integrate incident light intensity over space and time independently of the polarization states of light, losing geometric and material information encoded in the polarimetric transport. Capturing polarization-resolved spatial transport components of (b) epipolar and (c) non-epipolar dimensions enable fine-grained decomposition of light transport. Combining temporal and polarimetric dimensions, we separate (e) geometry-dependent reflections and (f) direct/indirect reflections that cannot be resolved in the temporal-only measurements.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"103 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115569596","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 11
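A minimal numerical sketch, illustrative only and not the paper's system: a polarimetric measurement modeled as generator Stokes state, scene Mueller matrix, and analyzer Stokes state, followed by an SVD to show the kind of low-rank structure in the polarization domain that probing can exploit. The Mueller matrix below is a textbook ideal polarizer, used as a stand-in scene element.

```python
# Minimal sketch (textbook Mueller/Stokes model, not the paper's capture system):
# probe a scene Mueller matrix with polarized light, record intensity through an
# analyzer, and inspect low-rank structure via singular values.
import numpy as np

def measure(mueller: np.ndarray, generator: np.ndarray, analyzer: np.ndarray) -> float:
    """Detected intensity for a generator/analyzer pair (dual-polarizer model)."""
    return float(0.5 * analyzer @ (mueller @ generator))

# Mueller matrix of an ideal horizontal linear polarizer (stand-in scene element).
M = 0.5 * np.array([[1, 1, 0, 0],
                    [1, 1, 0, 0],
                    [0, 0, 0, 0],
                    [0, 0, 0, 0]], dtype=float)

s_h = np.array([1.0, 1.0, 0.0, 0.0])   # horizontally polarized probe light
s_v = np.array([1.0, -1.0, 0.0, 0.0])  # vertically polarized analyzer state
print("co-polarized:", measure(M, s_h, s_h), " cross-polarized:", measure(M, s_h, s_v))

# Low-rank structure: the singular values show this element is effectively rank 1.
print("singular values:", np.round(np.linalg.svd(M, compute_uv=False), 3))
```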
Demonstrating MagnetIO: Passive yet Interactive Soft Haptic Patches Anywhere
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465342
Alex Mazursky, Shan-Yuan Teng, Romain Nith, Pedro Lopes
{"title":"Demonstrating MagnetIO: Passive yet Interactive Soft Haptic Patches Anywhere","authors":"Alex Mazursky, Shan-Yuan Teng, Romain Nith, Pedro Lopes","doi":"10.1145/3450550.3465342","DOIUrl":"https://doi.org/10.1145/3450550.3465342","url":null,"abstract":"We demonstrate a new type of haptic actuator, which we call MagnetIO, that is comprised of two parts: one battery-powered voice-coil worn on the user's fingernail and any number of interactive soft patches that can be attached onto any surface (everyday objects, user's body, appliances, etc.). When the user's finger wearing our voice-coil contacts any of the interactive patches it detects its magnetic signature via magnetometer and vibrates the patch, adding haptic feedback to otherwise input-only interactions. To allow these passive patches to vibrate, we make them from silicone with regions doped with polarized neodymium powder, resulting in soft and stretchable magnets. This stretchable form-factor allows them to be wrapped to the user's body or everyday objects of various shapes. We demonstrate how these add haptic output to many situations, such as adding haptic buttons to the walls of one's home.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"57 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127318317","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 1
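A minimal sketch under assumptions, not the authors' firmware (field magnitudes and thresholds are hypothetical): polling a fingernail-worn magnetometer, detecting that the finger is over a magnetized patch by the jump in field magnitude above the ambient baseline, and switching the voice coil on so the passive patch vibrates.

```python
# Minimal sketch (assumed thresholds): magnetometer magnitude above the ambient
# baseline indicates contact with a magnetized patch, which enables the voice coil.
import math

BASELINE_UT = 50.0             # ambient (Earth) field magnitude in microtesla, assumed
CONTACT_THRESHOLD_UT = 300.0   # assumed signature of the neodymium-doped silicone

def field_magnitude(bx: float, by: float, bz: float) -> float:
    return math.sqrt(bx * bx + by * by + bz * bz)

def should_drive_coil(sample_ut) -> bool:
    """Return True if the finger appears to be on a patch (drive the voice coil)."""
    return field_magnitude(*sample_ut) - BASELINE_UT > CONTACT_THRESHOLD_UT

for sample in [(20.0, 30.0, 35.0), (150.0, 400.0, 90.0)]:   # ambient vs. on-patch readings
    print(sample, "->", "drive coil" if should_drive_coil(sample) else "idle")
```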
Augmented reality representation of virtual user avatars moving in a virtual representation of the real world at their respective real world locations
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465337
Christoph Leuze, Matthias Leuze
{"title":"Augmented reality representation of virtual user avatars moving in a virtual representation of the real world at their respective real world locations","authors":"Christoph Leuze, Matthias Leuze","doi":"10.1145/3450550.3465337","DOIUrl":"https://doi.org/10.1145/3450550.3465337","url":null,"abstract":"In this work we present an augmented reality (AR) application that allows a user with an AR display to watch another user, flying an airplane in the Microsoft Flight Simulator 2020 (MSFS), at their respective location in the real world. To do that, we take the location data of a virtual 3D airplane model in a virtual representation of the world of a user playing MSFS, and stream it via a server to a mobile device. The mobile device user can then see the same 3D airplane model at exactly that real world location, that corresponds to the location of the virtual 3D airplane model in the virtual representation of the world. The mobile device user can also see the avatar movement updated according to the 3D airplane movement in the virtual world. We implemented the application on both a cellphone and a see-through headset.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"20 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130066807","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 0
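A minimal sketch under assumptions: converting the simulated airplane's latitude/longitude/altitude, as streamed from the simulator, into a local east-north-up offset from the AR device, which is the kind of quantity a renderer would use to anchor the airplane model at the corresponding real-world location. It uses a flat-earth approximation valid near the device; the coordinate values in the example are hypothetical.

```python
# Minimal sketch (flat-earth/ENU approximation, hypothetical coordinates): compute
# the airplane's offset in meters relative to the AR device from geodetic positions.
import math

EARTH_RADIUS_M = 6_371_000.0

def geodetic_to_enu(plane_lat, plane_lon, plane_alt_m,
                    device_lat, device_lon, device_alt_m):
    """Return (east, north, up) in meters of the airplane relative to the device."""
    lat0 = math.radians(device_lat)
    d_lat = math.radians(plane_lat - device_lat)
    d_lon = math.radians(plane_lon - device_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(lat0)
    north = EARTH_RADIUS_M * d_lat
    up = plane_alt_m - device_alt_m
    return east, north, up

# Airplane about 0.01 degrees north-east of the viewer and 500 m above them.
print(geodetic_to_enu(37.43, -122.16, 520.0, 37.42, -122.17, 20.0))
```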
Behind The Game: Implicit Spatio-Temporal Intervention in Inter-personal Remote Physical Interactions on Playing Air Hockey
ACM SIGGRAPH 2021 Emerging Technologies Pub Date : 2021-08-05 DOI: 10.1145/3450550.3465348
Azumi Maekawa, Hiroto Saito, Narin Okazaki, Shunichi Kasahara, M. Inami
{"title":"Behind The Game: Implicit Spatio-Temporal Intervention in Inter-personal Remote Physical Interactions on Playing Air Hockey","authors":"Azumi Maekawa, Hiroto Saito, Narin Okazaki, Shunichi Kasahara, M. Inami","doi":"10.1145/3450550.3465348","DOIUrl":"https://doi.org/10.1145/3450550.3465348","url":null,"abstract":"When playing inter-personal sports games remotely, the time lag between user actions and feedback decreases the user’s performance and sense of agency. While computational assistance can improve performance, naive intervention independent of the context also compromises the user’s sense of agency. We propose a context-aware assistance method that retrieves both user performance and sense of agency, and we demonstrate the method using air hockey (a two-dimensional physical game) as a testbed. Our system includes a 2D plotter-like machine that controls the striker on half of the table surface, and a web application interface that enables manipulation of the striker from a remote location. Using our system, a remote player can play against a physical opponent from anywhere through a web browser. We designed the striker control assistance based on the context by computationally predicting the puck’s trajectory using a real-time captured video image. With this assistance, the remote player exhibits an improved performance without compromising their sense of agency, and both players can experience the excitement of the game.","PeriodicalId":286424,"journal":{"name":"ACM SIGGRAPH 2021 Emerging Technologies","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-08-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128232338","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
Citations: 2
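A minimal sketch under assumptions, not the authors' controller (table size, defense-line position, and time step are hypothetical): predicting where the puck will cross the remote striker's defense line by extrapolating its tracked velocity and reflecting it off the side walls, so the assisted striker can be moved toward the predicted intercept point ahead of the laggy remote command.

```python
# Minimal sketch (assumed table geometry): extrapolate the tracked puck, bouncing
# off the side walls, to find where it will cross the remote striker's defense line.
TABLE_WIDTH = 0.60      # meters, assumed playfield width
DEFENSE_LINE_Y = 0.10   # meters from the remote player's end, assumed

def predict_intercept_x(puck_x, puck_y, vel_x, vel_y, dt=1.0 / 120.0, max_steps=2000):
    """Step the puck forward until it reaches the defense line; return its x there."""
    if vel_y >= 0:
        return None                                # puck is not moving toward the remote side
    for _ in range(max_steps):
        puck_x += vel_x * dt
        puck_y += vel_y * dt
        if puck_x < 0.0 or puck_x > TABLE_WIDTH:   # bounce off the side walls
            puck_x = min(max(puck_x, 0.0), TABLE_WIDTH)
            vel_x = -vel_x
        if puck_y <= DEFENSE_LINE_Y:
            return puck_x
    return None

print(predict_intercept_x(puck_x=0.50, puck_y=1.00, vel_x=-0.8, vel_y=-1.2))
```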