A Real-Time Virtual-Real Fusion Rendering Framework in Cloud-Edge Environments
Yuxi Zhou, Bowen Gao, Hongxin Zhang, Wei Chen, Xiaoliang Luo, Lvchun Wang
Computer Animation and Virtual Worlds, vol. 36, no. 4, published 2025-07-21
DOI: 10.1002/cav.70049 (https://onlinelibrary.wiley.com/doi/10.1002/cav.70049)
Citations: 0
Abstract
This paper introduces a cloud-edge collaborative framework for real-time virtual-real fusion rendering in augmented reality (AR). By integrating Visual Simultaneous Localization and Mapping (VSLAM) with Neural Radiance Fields (NeRF), the proposed method achieves high-fidelity virtual object placement and shadow synthesis in real-world scenes. The cloud server handles computationally intensive tasks, including offline NeRF-based 3D reconstruction and online illumination estimation, while edge devices perform real-time data acquisition, SLAM-based plane detection, and rendering. To enhance realism, the system employs an improved soft shadow generation technique that dynamically adjusts shadow parameters based on light source information. Experimental results across diverse indoor environments demonstrate the system's effectiveness, with consistent real-time performance, accurate illumination estimation, and high-quality shadow rendering. The proposed method reduces the computational burden on edge devices, enabling immersive AR experiences on resource-constrained hardware, such as mobile and wearable devices.
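The abstract mentions a soft shadow generation technique that dynamically adjusts shadow parameters from light-source information, but does not spell out the formula. A common heuristic for this (used in percentage-closer soft shadows) scales the penumbra width with the light's size and the blocker-to-receiver separation. The sketch below illustrates that heuristic only; the function name and formula are assumptions, not the authors' implementation.

```python
def penumbra_width(light_size: float, d_receiver: float, d_blocker: float) -> float:
    """Estimate soft-shadow penumbra width (PCSS-style heuristic).

    A larger light source, or a greater gap between the blocker and the
    shadow receiver, produces a softer shadow edge; a point light
    (light_size == 0) produces a hard-edged shadow.
    """
    if d_blocker <= 0:
        raise ValueError("blocker distance must be positive")
    return light_size * (d_receiver - d_blocker) / d_blocker

# A point light yields a hard edge regardless of scene geometry.
assert penumbra_width(0.0, 5.0, 2.0) == 0.0
```

In a cloud-edge split like the one described, the light-source parameters feeding such a formula would come from the cloud-side illumination estimation, while the per-pixel shadow evaluation runs on the edge device.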
Journal Introduction
With the advent of very powerful PCs and high-end graphics cards, Virtual Worlds, real-time computer animation and simulation, and games have developed dramatically. At the same time, new and cheaper Virtual Reality devices have appeared, allowing interaction with these real-time Virtual Worlds, and even with the real world through Augmented Reality. Three-dimensional characters, especially Virtual Humans, are now of exceptional quality, which allows their use in the movie industry. But this is only a beginning: with the development of Artificial Intelligence and agent technology, these characters will become more and more autonomous, and even intelligent. They will inhabit Virtual Worlds in a Virtual Life together with animals and plants.