{"title":"MoFlow: Motion-Guided Flows for Recurrent Rendered Frame Prediction","authors":"Zhizhen Wu, Zhilong Yuan, Chenyu Zuo, Yazhen Yuan, Yifan PENG, Guiyang Pu, Rui Wang, Yuchi Huo","doi":"10.1145/3730400","DOIUrl":null,"url":null,"abstract":"Rendering realistic images in real-time on high-frame-rate display devices poses considerable challenges, even with advanced graphics cards. This stimulates a demand for frame prediction technologies to boost frame rates. The key to these algorithms is to exploit spatiotemporal coherence by warping rendered pixels with motion representations. However, existing motion estimation methods can suffer from low precision, high overhead, and incomplete support for visual effects. In this article, we present a rendered frame prediction framework with a novel motion representation, dubbed <jats:italic>motion-guided flow (MoFlow)</jats:italic> , aiming to overcome the intrinsic limitations of optical flow and motion vectors and precisely capture the dynamics of intricate geometries, lighting, and translucent objects. Notably, we construct MoFlows using a recurrent feature streaming network, which specializes in learning latent motion features from multiple frames. The results of extensive experiments demonstrate that, compared to state-of-the-art methods, our method achieves superior visual quality and temporal stability with lower latency. The recurrent mechanism allows our method to predict single or multiple consecutive frames, increasing the frame rate by over 2 ×. The proposed approach represents a flexible pipeline to meet the demands of various graphics applications, devices, and scenarios.","PeriodicalId":50913,"journal":{"name":"ACM Transactions on Graphics","volume":"10 1","pages":""},"PeriodicalIF":7.8000,"publicationDate":"2025-04-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on Graphics","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1145/3730400","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, SOFTWARE ENGINEERING","Score":null,"Total":0}
Citations: 0
Abstract
Rendering realistic images in real time on high-frame-rate display devices poses considerable challenges, even with advanced graphics cards. This stimulates demand for frame prediction technologies to boost frame rates. The key to these algorithms is to exploit spatiotemporal coherence by warping rendered pixels with motion representations. However, existing motion estimation methods can suffer from low precision, high overhead, and incomplete support for visual effects. In this article, we present a rendered frame prediction framework with a novel motion representation, dubbed motion-guided flow (MoFlow), aiming to overcome the intrinsic limitations of optical flow and motion vectors and to precisely capture the dynamics of intricate geometries, lighting, and translucent objects. Notably, we construct MoFlows using a recurrent feature streaming network, which specializes in learning latent motion features from multiple frames. Extensive experiments demonstrate that, compared to state-of-the-art methods, our method achieves superior visual quality and temporal stability with lower latency. The recurrent mechanism allows our method to predict single or multiple consecutive frames, increasing the frame rate by over 2×. The proposed approach represents a flexible pipeline that meets the demands of various graphics applications, devices, and scenarios.
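The core primitive the abstract refers to, warping rendered pixels with a motion representation, can be made concrete with a short sketch. The code below is not the paper's MoFlow network or its recurrent feature streaming architecture; it is a generic backward-warping example, assuming PyTorch and a dense per-pixel flow field, with the helper name warp_frame chosen purely for illustration.

```python
# Minimal sketch of flow-based frame warping (backward warping), the
# generic primitive behind "warping rendered pixels with motion
# representations". NOT the paper's MoFlow; shapes and names are
# illustrative assumptions.
import torch
import torch.nn.functional as F

def warp_frame(frame: torch.Tensor, flow: torch.Tensor) -> torch.Tensor:
    """Backward-warp `frame` (N, C, H, W) by a dense flow field
    `flow` (N, 2, H, W) given in pixels (channel 0 = x, channel 1 = y),
    so that output[y, x] ~= frame[y + flow_y, x + flow_x]."""
    n, _, h, w = frame.shape
    # Base sampling grid in pixel coordinates.
    ys, xs = torch.meshgrid(
        torch.arange(h, dtype=frame.dtype),
        torch.arange(w, dtype=frame.dtype),
        indexing="ij",
    )
    grid = torch.stack((xs, ys), dim=0).unsqueeze(0)  # (1, 2, H, W)
    # Displace the grid by the flow, then normalize to [-1, 1],
    # the coordinate convention grid_sample expects.
    coords = grid + flow
    coords_x = 2.0 * coords[:, 0] / (w - 1) - 1.0
    coords_y = 2.0 * coords[:, 1] / (h - 1) - 1.0
    sample_grid = torch.stack((coords_x, coords_y), dim=-1)  # (N, H, W, 2)
    return F.grid_sample(frame, sample_grid, mode="bilinear",
                         padding_mode="border", align_corners=True)

# Toy usage: warp the last rendered frame by a (here zero) flow field.
frame = torch.rand(1, 3, 240, 320)
flow = torch.zeros(1, 2, 240, 320)
predicted = warp_frame(frame, flow)
```

In a frame-prediction pipeline like the one described, the flow field would come from the learned motion representation (MoFlow in the paper) rather than being zero-initialized as in this toy usage.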
Journal Introduction:
ACM Transactions on Graphics (TOG) is a peer-reviewed scientific journal that aims to disseminate notable findings in the field of computer graphics. It has been published by the Association for Computing Machinery since 1982. Since 2003, all papers accepted for presentation at the annual SIGGRAPH conference have been printed in a special summer issue of the journal.