Resynthesizing reality: driving vivid virtual environments from sensor networks

D. D. Haddad, G. Dublon, Brian D. Mayton, S. Russell, Xiao Xiao, K. Perlin, J. Paradiso
DOI: 10.1145/3084363.3085027
Published in: ACM SIGGRAPH 2017 Talks, July 30, 2017
Citations: 3

Abstract

The rise of ubiquitous sensing enables the harvesting of massive amounts of data from the physical world. This data is often used to drive the behavior of devices, and when presented to users, it is most commonly visualized quantitatively, as graphs and charts. Another approach for the representation of sensor network data presents the data within a rich, virtual environment. These scenes can be generated based on the physical environment, and their appearance can change based on the state of sensor nodes. By freely exploring these environments, users gain a vivid, multi-modal, and experiential perspective into large, multi-dimensional datasets. This paper presents the concept of "Resynthesizing Reality" through a case study we have created based on a network of environmental sensors deployed at a large-scale wetland restoration site. We describe the technical implementation of our system, present techniques to visualize sensor data within the virtual environment, and discuss potential applications for such Resynthesized Realities.
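The abstract describes scenes whose appearance changes with the state of sensor nodes. As a minimal sketch of that idea, the snippet below maps a single environmental reading to renderable scene parameters; all names (`SensorReading`, `scene_params`), the specific sensor channels, and the parameter ranges are illustrative assumptions, not details from the paper or its system.

```python
# Hypothetical sketch: mapping one sensor node's state onto scene parameters.
from dataclasses import dataclass


@dataclass
class SensorReading:
    node_id: str
    temperature_c: float   # ambient temperature in Celsius
    humidity_pct: float    # relative humidity, 0-100


def lerp(a: float, b: float, t: float) -> float:
    """Linear interpolation with t clamped to [0, 1]."""
    t = max(0.0, min(1.0, t))
    return a + (b - a) * t


def scene_params(reading: SensorReading) -> dict:
    """Map a reading to visual parameters: higher humidity thickens
    the fog, warmer temperatures shift the tint toward warm colors."""
    fog_density = lerp(0.0, 1.0, reading.humidity_pct / 100.0)
    # Normalize temperature over an assumed -10..40 C outdoor range.
    warmth = lerp(0.0, 1.0, (reading.temperature_c + 10.0) / 50.0)
    return {"fog_density": fog_density, "tint_warmth": warmth}


if __name__ == "__main__":
    r = SensorReading("wetland-07", temperature_c=15.0, humidity_pct=80.0)
    print(scene_params(r))
```

In a real pipeline these parameters would feed a game-engine material or post-processing pass each frame; the point here is only the shape of the mapping from sensor state to appearance.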