NeARportation: A Remote Real-time Neural Rendering Framework

Yuichi Hiroi, Yuta Itoh, J. Rekimoto
{"title":"一个远程实时神经渲染框架","authors":"Yuichi Hiroi, Yuta Itoh, J. Rekimoto","doi":"10.1145/3562939.3565616","DOIUrl":null,"url":null,"abstract":"While presenting a photorealistic appearance plays a major role in immersion in Augmented Virtuality environment, displaying that of real objects remains a challenge. Recent developments in photogrammetry have facilitated the incorporation of real objects into virtual space. However, reproducing complex appearances, such as subsurface scattering and transparency, still requires a dedicated environment for measurement and possesses a trade-off between rendering quality and frame rate. Our NeARportation framework combines server–client bidirectional communication and neural rendering to resolve these trade-offs. Neural rendering on the server receives the client’s head posture and generates a novel-view image with realistic appearance reproduction that is streamed onto the client’s display. By applying our framework to a stereoscopic display, we confirm that it can display a high-fidelity appearance on full-HD stereo videos at 35-40 frames per second (fps) according to the user’s head motion.","PeriodicalId":134843,"journal":{"name":"Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology","volume":"348 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"NeARportation: A Remote Real-time Neural Rendering Framework\",\"authors\":\"Yuichi Hiroi, Yuta Itoh, J. Rekimoto\",\"doi\":\"10.1145/3562939.3565616\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"While presenting a photorealistic appearance plays a major role in immersion in Augmented Virtuality environment, displaying that of real objects remains a challenge. Recent developments in photogrammetry have facilitated the incorporation of real objects into virtual space. However, reproducing complex appearances, such as subsurface scattering and transparency, still requires a dedicated environment for measurement and possesses a trade-off between rendering quality and frame rate. Our NeARportation framework combines server–client bidirectional communication and neural rendering to resolve these trade-offs. Neural rendering on the server receives the client’s head posture and generates a novel-view image with realistic appearance reproduction that is streamed onto the client’s display. 
By applying our framework to a stereoscopic display, we confirm that it can display a high-fidelity appearance on full-HD stereo videos at 35-40 frames per second (fps) according to the user’s head motion.\",\"PeriodicalId\":134843,\"journal\":{\"name\":\"Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology\",\"volume\":\"348 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-22\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1145/3562939.3565616\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 28th ACM Symposium on Virtual Reality Software and Technology","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3562939.3565616","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

While presenting a photorealistic appearance plays a major role in immersion in an Augmented Virtuality environment, displaying the appearance of real objects remains a challenge. Recent developments in photogrammetry have facilitated the incorporation of real objects into virtual space. However, reproducing complex appearances, such as subsurface scattering and transparency, still requires a dedicated measurement environment and involves a trade-off between rendering quality and frame rate. Our NeARportation framework combines server–client bidirectional communication and neural rendering to resolve these trade-offs. Neural rendering on the server receives the client's head pose and generates a novel-view image with realistic appearance reproduction, which is streamed to the client's display. By applying our framework to a stereoscopic display, we confirm that it can present a high-fidelity appearance in full-HD stereo video at 35–40 frames per second (fps), following the user's head motion.
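
The abstract describes a simple round trip: the client streams its head pose to the server, the server's neural renderer synthesizes a novel view for that pose, and the encoded frame is streamed back for display. The Python sketch below illustrates one possible shape of that loop; it is not the authors' implementation, and the socket-based wire format, the port number, and the render_novel_view callback are assumptions made only for this example.

```python
# Minimal sketch (assumed, not the authors' code) of a NeARportation-style loop:
# the client sends a 4x4 head-pose matrix; the server renders a novel view and
# streams the encoded frame back, length-prefixed.

import socket
import struct

POSE_FLOATS = 16              # 4x4 head-pose matrix, row-major
POSE_BYTES = POSE_FLOATS * 4  # 32-bit floats on the wire

def recv_exact(conn: socket.socket, n: int) -> bytes:
    """Read exactly n bytes from the connection."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("peer closed the connection")
        buf += chunk
    return buf

def serve(render_novel_view, host: str = "0.0.0.0", port: int = 9000) -> None:
    """Server loop: receive a head pose, render, stream the encoded frame back.
    `render_novel_view` is a hypothetical callback standing in for the neural renderer;
    it takes a 16-float pose tuple and returns an encoded frame as bytes."""
    with socket.create_server((host, port)) as srv:
        conn, _ = srv.accept()
        with conn:
            while True:
                pose = struct.unpack(f"<{POSE_FLOATS}f", recv_exact(conn, POSE_BYTES))
                frame = render_novel_view(pose)
                conn.sendall(struct.pack("<I", len(frame)) + frame)

def request_frame(sock: socket.socket, pose: list[float]) -> bytes:
    """Client side: send the current head pose, block until the rendered frame arrives."""
    sock.sendall(struct.pack(f"<{POSE_FLOATS}f", *pose))
    (length,) = struct.unpack("<I", recv_exact(sock, 4))
    return recv_exact(sock, length)  # e.g. a compressed image to decode and display
```

In practice the paper reports full-HD stereo at 35–40 fps, which implies streaming compressed video rather than raw frames and overlapping pose upload with frame download; the blocking request/response loop above is only the simplest possible arrangement.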