{"title":"基于360°RGBD视频的交互式混合现实渲染全景光线跟踪。","authors":"Jian Wu, Lili Wang","doi":"10.1109/MCG.2023.3327383","DOIUrl":null,"url":null,"abstract":"<p><p>This article presents an interactive panoramic ray tracing method for rendering real-time realistic lighting and shadow effects when virtual objects are inserted in 360$^{\\circ }$∘ RGBD videos. First, we approximate the geometry of the real scene. We propose a sparse sampling ray generation method to speed up the tracing process by reducing the number of rays that need to be emitted in ray tracing. After that, an irradiance estimation channel is introduced to generate noisy Monte Carlo images. Finally, the final result is smoothed and synthesized by interpolation, temporal filtering, and differential rendering. We tested our method in a number of natural and synthesized scenes and compared our method with results from ground truth and image-based illumination methods. The results show that our method can generate visually realistic frames for dynamic virtual objects in 360$^{\\circ }$∘ RGBD videos in real time, making the rendering results more natural and believable.</p>","PeriodicalId":55026,"journal":{"name":"IEEE Computer Graphics and Applications","volume":"PP ","pages":"62-75"},"PeriodicalIF":1.7000,"publicationDate":"2024-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Panoramic Ray Tracing for Interactive Mixed Reality Rendering Based on 360° RGBD Video.\",\"authors\":\"Jian Wu, Lili Wang\",\"doi\":\"10.1109/MCG.2023.3327383\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>This article presents an interactive panoramic ray tracing method for rendering real-time realistic lighting and shadow effects when virtual objects are inserted in 360$^{\\\\circ }$∘ RGBD videos. First, we approximate the geometry of the real scene. We propose a sparse sampling ray generation method to speed up the tracing process by reducing the number of rays that need to be emitted in ray tracing. After that, an irradiance estimation channel is introduced to generate noisy Monte Carlo images. Finally, the final result is smoothed and synthesized by interpolation, temporal filtering, and differential rendering. We tested our method in a number of natural and synthesized scenes and compared our method with results from ground truth and image-based illumination methods. 
The results show that our method can generate visually realistic frames for dynamic virtual objects in 360$^{\\\\circ }$∘ RGBD videos in real time, making the rendering results more natural and believable.</p>\",\"PeriodicalId\":55026,\"journal\":{\"name\":\"IEEE Computer Graphics and Applications\",\"volume\":\"PP \",\"pages\":\"62-75\"},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2024-01-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Computer Graphics and Applications\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/MCG.2023.3327383\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"2024/1/25 0:00:00\",\"PubModel\":\"Epub\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, SOFTWARE ENGINEERING\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Computer Graphics and Applications","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/MCG.2023.3327383","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2024/1/25 0:00:00","PubModel":"Epub","JCR":"Q3","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Panoramic Ray Tracing for Interactive Mixed Reality Rendering Based on 360° RGBD Video.
This article presents an interactive panoramic ray tracing method for rendering realistic lighting and shadow effects in real time when virtual objects are inserted into 360° RGBD videos. First, we approximate the geometry of the real scene. We then propose a sparse sampling ray generation method that speeds up tracing by reducing the number of rays that must be emitted. After that, an irradiance estimation channel is introduced to generate noisy Monte Carlo images. Finally, the result is smoothed and synthesized by interpolation, temporal filtering, and differential rendering. We tested our method in a number of natural and synthesized scenes and compared it with ground truth and with image-based illumination methods. The results show that our method can generate visually realistic frames for dynamic virtual objects in 360° RGBD videos in real time, making the rendering results more natural and believable.
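Although the paper's implementation is not reproduced here, two of the steps the abstract names, panoramic ray generation with sparse sampling and differential rendering, follow well-known constructions. The sketch below (Python with NumPy; all function names, the fixed stride parameter, and the virtual-object mask input are illustrative assumptions, not taken from the paper) shows how equirectangular pixels of a 360° frame can be mapped to a subsampled set of ray directions, and how a differential composite adds the lighting change caused by virtual objects to the captured background.

# Minimal sketch, not the authors' implementation: sparse panoramic ray
# generation and classic differential-rendering compositing.
import numpy as np

def panoramic_ray_directions(height, width, stride=4):
    """Map a subsampled grid of equirectangular pixels to unit ray directions.

    Sampling every `stride`-th pixel is a stand-in for the paper's sparse
    sampling ray generation, which reduces the number of rays to trace.
    """
    rows = np.arange(0, height, stride)
    cols = np.arange(0, width, stride)
    v, u = np.meshgrid(rows, cols, indexing="ij")
    # Longitude in [-pi, pi), latitude in [-pi/2, pi/2] for the panorama.
    lon = (u + 0.5) / width * 2.0 * np.pi - np.pi
    lat = np.pi / 2.0 - (v + 0.5) / height * np.pi
    dirs = np.stack([np.cos(lat) * np.sin(lon),
                     np.sin(lat),
                     np.cos(lat) * np.cos(lon)], axis=-1)
    return dirs  # shape: (height // stride, width // stride, 3)

def differential_composite(background, lit_with_virtual, lit_without_virtual,
                           virtual_mask):
    """Differential rendering: add the lighting change the virtual objects
    cause (shadows, interreflections) to the captured frame, and show the
    virtual objects directly where their mask covers the image."""
    delta = lit_with_virtual - lit_without_virtual
    return np.where(virtual_mask[..., None],
                    lit_with_virtual,
                    np.clip(background + delta, 0.0, 1.0))

The fixed stride here is only a placeholder for the sparsity control; the paper's actual sparse sampling strategy may be adaptive rather than a regular grid, and the traced results would additionally be interpolated and temporally filtered before compositing, as the abstract describes.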
About the Journal:
IEEE Computer Graphics and Applications (CG&A) bridges the theory and practice of computer graphics, visualization, virtual and augmented reality, and HCI. From specific algorithms to full system implementations, CG&A offers a unique combination of peer-reviewed feature articles and informal departments. Theme issues guest-edited by leading researchers in their fields track the latest developments and trends in computer-generated graphical content, while tutorials and surveys provide a broad overview of interesting and timely topics. Regular departments further explore the core areas of graphics as well as extend into topics such as usability, education, history, and opinion. In each issue, the story behind our cover focuses on creative applications of the technology by an artist or designer. Published six times a year, CG&A is indispensable reading for people working at the leading edge of computer-generated graphics technology and its applications in everything from business to the arts.