Media of Things: Supporting the Production of Metadata Rich Media Through IoT Sensing

Gerard Wilkinson, Tom Bartindale, Thomas Nappey, Michael Evans, Peter C. Wright, P. Olivier
{"title":"物联网媒体:通过物联网感知支持富元数据媒体的生产","authors":"Gerard Wilkinson, Tom Bartindale, Thomas Nappey, Michael Evans, Peter C. Wright, P. Olivier","doi":"10.1145/3173574.3173780","DOIUrl":null,"url":null,"abstract":"Rich metadata is becoming a key part of the broadcast production pipeline. This information can be used to deliver compelling new consumption experiences which are personalized, location-aware, interactive and multi-screen. However, media producers are struggling to generate the metadata required for such experiences, using inefficient post-production solutions which are limited in how much of the original context they can capture. In response, we present Media of Things (MoT), a tool for on-location media productions. MoT enables practical and flexible generation of sensor based point-of-capture metadata. We demonstrate how embedded ubiquitous sensing technologies such as the Internet of Things can be leveraged to produce context rich, time sequenced metadata in a production studio. We reflect on how this workflow can be integrated within the constraints of broadcast production and the possibilities that emerge from access to rich data at the beginning of the production lifecycle to produce well described media for reconfigurable consumption.","PeriodicalId":20512,"journal":{"name":"Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2018-04-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"9","resultStr":"{\"title\":\"Media of Things: Supporting the Production of Metadata Rich Media Through IoT Sensing\",\"authors\":\"Gerard Wilkinson, Tom Bartindale, Thomas Nappey, Michael Evans, Peter C. Wright, P. Olivier\",\"doi\":\"10.1145/3173574.3173780\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Rich metadata is becoming a key part of the broadcast production pipeline. This information can be used to deliver compelling new consumption experiences which are personalized, location-aware, interactive and multi-screen. However, media producers are struggling to generate the metadata required for such experiences, using inefficient post-production solutions which are limited in how much of the original context they can capture. In response, we present Media of Things (MoT), a tool for on-location media productions. MoT enables practical and flexible generation of sensor based point-of-capture metadata. We demonstrate how embedded ubiquitous sensing technologies such as the Internet of Things can be leveraged to produce context rich, time sequenced metadata in a production studio. 
We reflect on how this workflow can be integrated within the constraints of broadcast production and the possibilities that emerge from access to rich data at the beginning of the production lifecycle to produce well described media for reconfigurable consumption.\",\"PeriodicalId\":20512,\"journal\":{\"name\":\"Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-04-21\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"9\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3173574.3173780\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3173574.3173780","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 9

Abstract

Rich metadata is becoming a key part of the broadcast production pipeline. This information can be used to deliver compelling new consumption experiences which are personalized, location-aware, interactive and multi-screen. However, media producers are struggling to generate the metadata required for such experiences, using inefficient post-production solutions which are limited in how much of the original context they can capture. In response, we present Media of Things (MoT), a tool for on-location media productions. MoT enables practical and flexible generation of sensor based point-of-capture metadata. We demonstrate how embedded ubiquitous sensing technologies such as the Internet of Things can be leveraged to produce context rich, time sequenced metadata in a production studio. We reflect on how this workflow can be integrated within the constraints of broadcast production and the possibilities that emerge from access to rich data at the beginning of the production lifecycle to produce well described media for reconfigurable consumption.
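To make the idea of "sensor based point-of-capture metadata" concrete, the sketch below shows one plausible way such data could be structured: IoT sensor readings taken during a shoot are converted into events offset against the start of the recording, so they stay time-sequenced with the media. This is a minimal illustration only, not the authors' MoT implementation; the names (`SensorEvent`, `MetadataTrack`, `add_reading`) and the JSON layout are hypothetical.

```python
# Hypothetical sketch: recording point-of-capture IoT sensor readings as
# time-sequenced metadata alongside a media take. Not the MoT system itself.
import json
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone


@dataclass
class SensorEvent:
    sensor_id: str      # e.g. "studio/light-level" (illustrative name)
    value: float        # raw sensor reading
    offset_s: float     # seconds since the recording started


@dataclass
class MetadataTrack:
    recording_id: str
    started_at: str                               # ISO 8601 start of the take
    events: list = field(default_factory=list)

    def add_reading(self, sensor_id: str, value: float, at: datetime) -> None:
        """Convert a wall-clock sensor reading into a timecode-relative event."""
        start = datetime.fromisoformat(self.started_at)
        offset = (at - start).total_seconds()
        self.events.append(SensorEvent(sensor_id, value, offset))

    def to_json(self) -> str:
        """Serialise the track so it can travel with the media file."""
        return json.dumps(asdict(self), indent=2)


# Usage: tag a take with a light-level reading captured 12.5 s into recording.
start = datetime(2018, 4, 21, 10, 0, 0, tzinfo=timezone.utc)
track = MetadataTrack("take-042", start.isoformat())
track.add_reading("studio/light-level", 840.0,
                  at=start.replace(second=12, microsecond=500_000))
print(track.to_json())
```

Keeping events relative to the recording's own timeline, rather than wall-clock time, is what lets downstream tools align the sensed context with individual frames during post-production or reconfigurable playback.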