SIGGRAPH Asia 2020 Posters: Latest Publications

Digital Twin of the Australian Square Kilometre Array (ASKAP)
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425462
T. Bednarz, D. Branchaud, Florence Wang, Justin Baker, M. Marquarding
Abstract: In this work, we present the Digital Twin of the Australian Square Kilometre Array Pathfinder (ASKAP), an extended reality framework for telescope monitoring. Most immersive visualisation tools developed in astronomy currently focus on the educational aspects of astronomical data or concepts. We extend this paradigm to allow complex operational network controls, with the aim of combining telescope monitoring, processing and observational data in the same framework.
Citations: 5
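
The poster gives no implementation details, but its stated goal of merging monitoring, processing and observational data into one framework can be illustrated with a minimal update loop. The sketch below is hypothetical: `poll_monitoring`, the `AntennaTwin` fields and the scheduling-block metadata are assumptions, not ASKAP's actual interfaces.

```python
# Hypothetical digital-twin update loop: poll antenna telemetry and merge it
# with observation metadata into one scene state a renderer could consume.
# Field names and the monitoring call are assumptions, not ASKAP's API.
import random
import time
from dataclasses import dataclass
from typing import Optional


@dataclass
class AntennaTwin:
    """State of one antenna in the digital-twin scene."""
    antenna_id: str
    azimuth_deg: float = 0.0
    elevation_deg: float = 0.0
    on_source: bool = False
    scheduling_block: Optional[str] = None


def poll_monitoring(antenna_id: str) -> dict:
    """Stand-in for a monitoring query; returns simulated pointing telemetry."""
    return {
        "azimuth_deg": random.uniform(0.0, 360.0),
        "elevation_deg": random.uniform(15.0, 89.0),
        "on_source": random.random() > 0.2,
    }


def update_twin(twin: AntennaTwin, telemetry: dict, observation: dict) -> None:
    """Merge monitoring telemetry and observation metadata into the twin state."""
    twin.azimuth_deg = telemetry["azimuth_deg"]
    twin.elevation_deg = telemetry["elevation_deg"]
    twin.on_source = telemetry["on_source"]
    twin.scheduling_block = observation.get("scheduling_block")


if __name__ == "__main__":
    antennas = [AntennaTwin(f"ak{i:02d}") for i in range(1, 4)]
    observation = {"scheduling_block": "SB-0001"}        # simulated metadata
    for _ in range(3):                                   # three update ticks
        for ant in antennas:
            update_twin(ant, poll_monitoring(ant.antenna_id), observation)
        print(antennas)
        time.sleep(0.1)
```
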
To touch or not to touch? Comparing Touch, mid-air gesture, mid-air haptics for public display in post COVID-19 society
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425438
Shaoyan Huang, Sakthi P B Ranganathan, Isaac Parsons
Abstract: We developed a mid-air touch Mixed Reality application that combines hand-tracking sensing and haptic feedback on a desktop display. We evaluated three hand interaction techniques, (1) Touch, (2) Mid-Air Gesture Touch and (3) Mid-Air Haptic Touch, through preliminary user testing with ten adults. Results suggest that users' willingness to use self-service devices in public places increases in the post-COVID-19 world, while their concern about coming into contact with the virus decreases. However, before large-scale deployment of this technology, accuracy and user experience design need to be improved.
Citations: 6
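
As an illustration of the contact-free interaction class compared in this poster, the sketch below implements a simple dwell-to-select rule over a tracked fingertip: a selection fires once the fingertip hovers inside a button region for a fixed time, so no surface is touched. The hand-tracking stream, button layout and dwell time are hypothetical; the poster does not describe its implementation.

```python
# Minimal sketch of dwell-based mid-air selection over a tracked fingertip.
# The fingertip stream and button layout are hypothetical placeholders.
import math
from dataclasses import dataclass


@dataclass
class Button:
    name: str
    x: float            # centre position in tracker coordinates (mm)
    y: float
    radius: float = 40.0


def inside(button: Button, tip_x: float, tip_y: float) -> bool:
    return math.hypot(tip_x - button.x, tip_y - button.y) <= button.radius


def dwell_select(samples, buttons, dwell_s=0.8):
    """Yield a button name once the fingertip has hovered over it for dwell_s.

    `samples` is an iterable of (timestamp_s, tip_x, tip_y) tuples, e.g. as
    delivered by a hand-tracking SDK callback.
    """
    hover_start = {}                      # button name -> time the hover began
    for t, x, y in samples:
        for b in buttons:
            if inside(b, x, y):
                hover_start.setdefault(b.name, t)
                if t - hover_start[b.name] >= dwell_s:
                    yield b.name
                    hover_start.pop(b.name)
            else:
                hover_start.pop(b.name, None)


if __name__ == "__main__":
    buttons = [Button("checkout", 0.0, 0.0), Button("cancel", 120.0, 0.0)]
    # Simulated 30 Hz fingertip stream hovering over "checkout" for one second.
    stream = [(i / 30.0, 5.0, -3.0) for i in range(30)]
    print(list(dwell_select(stream, buttons)))   # -> ['checkout']
```
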
Immersive 3D Body Painting System
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425467
Yoon-Seok Choi, Soonchul Jung, Jin-Seo Kim
Abstract: In recent virtual reality systems, users experience various types of content through precise interaction with the virtual world: wearing an HMD, their real-world actions are projected into the virtual world, going beyond audiovisual viewing. Virtual reality technology is also being actively applied in the arts. This paper proposes a novel immersive virtual 3D body painting system that provides the drawing tools and paint effects used in conventional body painting, supporting high-quality work at the concept design and pre-production stages. We analyzed the drawing effects of the airbrush and painting brush in collaboration with body painting experts, and deliver these effects through GPU-based real-time rendering. The system also provides the management functions, such as save/load and undo, that users need to create works in virtual reality.
Citations: 0
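
The poster attributes its drawing quality to GPU-based real-time rendering of airbrush and brush effects. As a CPU-side illustration of the airbrush idea only, the sketch below stamps a Gaussian-falloff dab into an RGBA paint texture with alpha blending; the falloff profile, flow parameter and texture layout are assumptions, not the authors' implementation.

```python
# Illustrative CPU sketch of an airbrush stamp: a Gaussian-falloff splat
# alpha-blended into an RGBA paint texture. The actual system does this on
# the GPU in real time; parameters here are assumptions.
import numpy as np


def airbrush_stamp(texture, cx, cy, radius, color, flow=0.5):
    """Blend one airbrush dab into `texture` (H x W x 4 float32, values in [0, 1]).

    cx, cy   centre of the dab in pixel coordinates
    radius   dab radius in pixels; opacity falls off as a Gaussian
    color    (r, g, b) paint colour
    flow     maximum per-dab opacity at the centre
    """
    h, w, _ = texture.shape
    ys, xs = np.mgrid[0:h, 0:w]
    dist2 = (xs - cx) ** 2 + (ys - cy) ** 2
    alpha = flow * np.exp(-dist2 / (2.0 * (radius / 2.0) ** 2))
    alpha[dist2 > radius ** 2] = 0.0                  # hard cutoff outside the dab
    alpha = alpha[..., None]
    rgb = np.asarray(color, dtype=np.float32)
    texture[..., :3] = (1.0 - alpha) * texture[..., :3] + alpha * rgb
    texture[..., 3:] = np.maximum(texture[..., 3:], alpha)   # accumulate coverage
    return texture


if __name__ == "__main__":
    canvas = np.zeros((256, 256, 4), dtype=np.float32)
    # Drag a red airbrush stroke across the canvas.
    for x in range(40, 200, 8):
        airbrush_stamp(canvas, x, 128, radius=24, color=(1.0, 0.1, 0.1), flow=0.3)
    print("max opacity:", float(canvas[..., 3].max()))
```
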
HEY!: Exploring Virtual Character Interaction for Immersive Storytelling via Electroencephalography
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425447
Yi-Hsuan Tseng, Tian-Jyun Lin, Tzu-Hsuan Yang, Ping-Hsuan Han, Saiau-Yue Tsau
Abstract: The Virtual Reality (VR) headset has become promising equipment for immersive storytelling. However, we know little about users while they are experiencing VR content. Users sometimes miss the narration because they are looking around, which makes designing a compelling VR story a challenge. With the advancement of electroencephalography (EEG) in VR, the story's rhythm or structure could change dynamically based on the audience's brain waves to create a personal dramatic moment. In this paper, we conduct a preliminary study to investigate the potential of a consumer-level brainwave headset and explore virtual character interaction to enhance immersive storytelling.
Citations: 1
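
To make the EEG-driven idea concrete, the sketch below shows one plausible gating rule: smooth a per-sample attention score from a consumer headset over a sliding window and trigger a virtual-character interaction when it drops below a threshold. The attention signal, window size and threshold are assumptions; the poster does not specify its signal processing or triggers.

```python
# Hypothetical EEG-driven story branching: trigger a character interaction
# when a smoothed attention score drops below a threshold.
from collections import deque


class AttentionGate:
    def __init__(self, window=10, threshold=0.4):
        self.samples = deque(maxlen=window)   # sliding window of attention in [0, 1]
        self.threshold = threshold

    def update(self, attention: float) -> bool:
        """Return True when the window is full and mean attention is below threshold."""
        self.samples.append(attention)
        mean = sum(self.samples) / len(self.samples)
        return len(self.samples) == self.samples.maxlen and mean < self.threshold


def story_step(gate: AttentionGate, attention: float) -> str:
    if gate.update(attention):
        return "character shouts 'HEY!' to pull the audience back to the narration"
    return "story continues on the main narrative beat"


if __name__ == "__main__":
    gate = AttentionGate()
    # Simulated attention stream: engaged at first, then drifting away.
    stream = [0.8, 0.7, 0.75, 0.6, 0.5, 0.35, 0.3, 0.25, 0.3, 0.2, 0.25, 0.2]
    for value in stream:
        print(f"attention={value:.2f} -> {story_step(gate, value)}")
```
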
Projection Mapped Gimmick Picture Book by Optical Illusion-Based Stereoscopic Vision
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425470
Sayaka Toda, Hiromitsu Fujii
Citations: 1
Generation of Origami Folding Animations from 3D Point Cloud Using Latent Space Interpolation
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425450
Chiaki Nakagaito, Takanori Nishino, K. Takeda
Citations: 0
Deep Specular Highlight Removal for Single Real-world Image
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425454
Zhongqi Wu, Chuanqing Zhuang, Jian Shi, Jun Xiao, Jianwei Guo
Abstract: Specular highlight removal is a challenging task. We present a novel data-driven approach for automatic specular highlight removal from a single image. To this end, we build a new dataset of real-world images for specular highlight removal with corresponding ground-truth diffuse images. Based on this dataset, we also present a specular highlight removal network that uses detected specular-reflection information as guidance. Experimental evaluations indicate that the proposed approach outperforms recent state-of-the-art methods.
Citations: 11
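
The abstract describes a removal network guided by detected specular reflections. The PyTorch sketch below shows one generic way such guidance can be wired up: a small detection branch predicts a highlight mask that is concatenated with the input image before the diffuse-image branch. Layer sizes and losses are illustrative assumptions, not the authors' architecture.

```python
# Sketch of a detection-guided removal network: a mask branch guides the
# diffuse-prediction branch. Not the authors' architecture.
import torch
import torch.nn as nn
import torch.nn.functional as F


def conv_block(in_ch, out_ch):
    return nn.Sequential(
        nn.Conv2d(in_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
        nn.Conv2d(out_ch, out_ch, kernel_size=3, padding=1),
        nn.ReLU(inplace=True),
    )


class GuidedHighlightRemoval(nn.Module):
    def __init__(self, base=32):
        super().__init__()
        # Detection branch: RGB image -> single-channel highlight mask.
        self.detect = nn.Sequential(conv_block(3, base), nn.Conv2d(base, 1, 1), nn.Sigmoid())
        # Removal branch: RGB image + predicted mask -> diffuse RGB image.
        self.remove = nn.Sequential(conv_block(4, base), conv_block(base, base), nn.Conv2d(base, 3, 1))

    def forward(self, image):
        mask = self.detect(image)
        diffuse = self.remove(torch.cat([image, mask], dim=1))
        return diffuse, mask


if __name__ == "__main__":
    model = GuidedHighlightRemoval()
    image = torch.rand(2, 3, 128, 128)          # batch of specular images
    gt_diffuse = torch.rand(2, 3, 128, 128)     # ground-truth diffuse images
    gt_mask = (torch.rand(2, 1, 128, 128) > 0.9).float()
    pred_diffuse, pred_mask = model(image)
    loss = F.l1_loss(pred_diffuse, gt_diffuse) \
         + F.binary_cross_entropy(pred_mask, gt_mask)
    loss.backward()
    print("loss:", float(loss))
```
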
The Induced Finger Movements Effect
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425448
Agata Marta Soccini
Abstract: The Sense of Embodiment in Virtual Reality is one of the key components in providing users with a convincing experience. Our contribution to a better understanding of the phenomenon focuses on analysing users' motor reaction to an alien finger movement. We assess quantitatively that viewing an alien movement (i.e. a movement of the self-avatar caused by an alien will) induces a finger posture variation, which we refer to as the Induced Finger Movements Effect. This only happens in the case of embodiment; in a disembodied setup the effect disappears. The principle under investigation is being tested as a basis for neuro-rehabilitation, with the goal of inducing movements in post-stroke hemiplegic patients.
Citations: 5
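
The poster reports a quantitative assessment of finger posture variation. One plausible way to quantify such a variation, assuming access to tracked joint angles, is the mean absolute deviation from a pre-stimulus baseline, compared between embodied and disembodied conditions; the sketch below uses simulated traces and is not the author's analysis pipeline.

```python
# Illustrative metric: mean absolute deviation of a finger joint angle from
# its pre-stimulus baseline. The metric and data layout are assumptions.
import numpy as np


def posture_variation(angles_deg: np.ndarray, baseline_frames: int = 30) -> float:
    """Mean absolute deviation (degrees) from the pre-stimulus baseline posture."""
    baseline = angles_deg[:baseline_frames].mean()
    return float(np.abs(angles_deg[baseline_frames:] - baseline).mean())


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    frames = 120
    # Simulated index-finger flexion traces (degrees): a small drift appears
    # only in the embodied condition after the alien movement at frame 30.
    embodied = 10.0 + rng.normal(0.0, 0.5, frames) + np.where(np.arange(frames) > 30, 3.0, 0.0)
    disembodied = 10.0 + rng.normal(0.0, 0.5, frames)
    print("embodied variation   :", round(posture_variation(embodied), 2))
    print("disembodied variation:", round(posture_variation(disembodied), 2))
```
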
Sound Reactive Bio-Inspired Snake Robot Simulation
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425439
Sriranjan Rasakatla, I. Mizuuchi, B. Indurkhya
Abstract: We present a hardware and software framework in which the direction of a sound source is used to interact with a simulation of a snake robot. We present a gamification idea (similar to hide and seek) of how sound direction can drive interactive simulations in robotics, in particular the bio-inspired idea of a snake's reactive locomotion to sound. We use multiple microphones to calculate the direction of the incoming sound in near real time and make the simulation respond to it. Since a biological snake moves away from a sound source when it senses vibrations, we bio-mimic this behavior in the simulated snake robot. The idea can be used to develop games that react to multiple people interacting with a computer through sound-direction input. To our knowledge, this interface is the first of its kind.
Citations: 0
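
The core signal-processing step, estimating the direction of the sound source from multiple microphones, is commonly done from the time difference of arrival between microphone pairs. The sketch below estimates a bearing from the cross-correlation lag between two channels and steers the simulated snake away from it; the microphone spacing, sample rate and steering rule are illustrative assumptions, not the authors' setup.

```python
# Sketch of sound-direction estimation from two microphones via the
# cross-correlation lag (time difference of arrival), plus an "evade" heading.
import numpy as np

SPEED_OF_SOUND = 343.0      # m/s
MIC_SPACING = 0.2           # m between the two microphones (assumed)
SAMPLE_RATE = 48_000        # Hz (assumed)


def estimate_bearing(left: np.ndarray, right: np.ndarray) -> float:
    """Bearing in degrees (0 = straight ahead, positive = towards the right mic)."""
    corr = np.correlate(left, right, mode="full")
    lag = np.argmax(corr) - (len(right) - 1)   # positive lag: left channel arrives later
    delay = lag / SAMPLE_RATE
    # Far-field approximation: delay = spacing * sin(theta) / c
    sin_theta = np.clip(delay * SPEED_OF_SOUND / MIC_SPACING, -1.0, 1.0)
    return float(np.degrees(np.arcsin(sin_theta)))


def evade_heading(bearing_deg: float) -> float:
    """Heading pointing directly away from the source, wrapped to [-180, 180)."""
    return (bearing_deg + 360.0) % 360.0 - 180.0


if __name__ == "__main__":
    # Simulate a clap arriving at the right microphone first (source on the right).
    t = np.arange(0, 0.05, 1.0 / SAMPLE_RATE)
    pulse = np.exp(-((t - 0.01) / 0.001) ** 2)
    right = pulse
    left = np.roll(pulse, 20)                  # reaches the left mic ~0.42 ms later
    bearing = estimate_bearing(left, right)
    print(f"estimated bearing: {bearing:.1f} deg, snake heads to {evade_heading(bearing):.1f} deg")
```
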
FeatureNet: Upsampling of Point Cloud and it's Associated Features
SIGGRAPH Asia 2020 Posters, Pub Date: 2020-12-04, DOI: 10.1145/3415264.3425471
Shanthika Naik, U. Mudenagudi, R. Tabib, Adarsh Jamadandi
Abstract: In this paper, we address the problem of 3D point cloud upsampling: given a set of points, the objective is to obtain a denser point cloud representation. We achieve this with a deep learning architecture that consumes point clouds directly, accepts associated auxiliary information such as normals and colors, and upsamples both together. We design a novel feature loss function to train this model. We demonstrate our work on the ModelNet dataset and show consistent improvements over existing methods.
Citations: 5
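
The abstract mentions a novel feature loss but does not define it. The PyTorch sketch below shows one plausible joint point-and-feature objective: a Chamfer-style distance in which each nearest-neighbour match also penalises the difference between the associated features (e.g. normals or colors). This is an illustrative stand-in, not the authors' loss.

```python
# Illustrative joint point-and-feature loss (Chamfer-style with a feature term).
# Not the authors' actual feature loss.
import torch


def chamfer_with_features(pred_xyz, pred_feat, gt_xyz, gt_feat, feat_weight=1.0):
    """pred_xyz, gt_xyz: (N, 3) and (M, 3); pred_feat, gt_feat: (N, C) and (M, C)."""
    d = torch.cdist(pred_xyz, gt_xyz)            # (N, M) pairwise point distances
    fwd_idx = d.argmin(dim=1)                    # nearest GT point for each prediction
    bwd_idx = d.argmin(dim=0)                    # nearest prediction for each GT point
    point_term = d.min(dim=1).values.mean() + d.min(dim=0).values.mean()
    feat_term = (pred_feat - gt_feat[fwd_idx]).norm(dim=1).mean() \
              + (gt_feat - pred_feat[bwd_idx]).norm(dim=1).mean()
    return point_term + feat_weight * feat_term


if __name__ == "__main__":
    torch.manual_seed(0)
    gt_xyz = torch.rand(2048, 3)                 # dense ground-truth points
    gt_feat = torch.rand(2048, 3)                # e.g. per-point colors
    pred_xyz = (gt_xyz[:1024] + 0.01 * torch.randn(1024, 3)).requires_grad_()
    pred_feat = (gt_feat[:1024] + 0.01 * torch.randn(1024, 3)).requires_grad_()
    loss = chamfer_with_features(pred_xyz, pred_feat, gt_xyz, gt_feat)
    loss.backward()
    print("loss:", float(loss))
```
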