SuraGS: Toward efficient few-shot novel view synthesis via surface-aware Gaussian splatting

IF 2.8 · CAS Q4 (Computer Science) · JCR Q2 (Computer Science, Software Engineering)
Junao Shen , Tian Feng , Haojie Dong , Jinkang Ji , Xinyu Wang , Tianjia Shao
DOI: 10.1016/j.cag.2025.104349
Journal: Computers & Graphics-Uk, Volume 131, Article 104349
Publication date: 2025-08-05 (Journal Article)
Citations: 0

Abstract


Neural radiance fields (NeRF) and 3D Gaussian splatting (3DGS) have benefited the task of novel view synthesis (NVS) that is critical to real-world applications. When extreme sparsity occurs in the training views (i.e., few-shot NVS), the synthesized view images may undergo a significant quality degradation due to overfitting. Although the conventional 3DGS-based methods for few-shot NVS have reached an impressive milestone in terms of synthesis quality and efficiency, they are yet to emphasize the awareness of the scene’s surface as well as the redundancy of Gaussians, which cause blurring and artifacts over complex regions in the scene. In this paper, we propose an efficient method for few-shot novel view synthesis with surface-aware Gaussian splatting (SuraGS). Specifically, we design two surface-aware strategies on surface Gaussian optimization and behind-surface Gaussian pruning. The former drives the foremost Gaussian along each ray to possess appropriate positions and orientations with respect to the scene’s surface, and the latter leverages a pruning mask to explicitly remove redundant Gaussians. In addition, we design an auxiliary training strategy based on semi-supervised learning to counteract overfitting. Experiments on different datasets demonstrate that our SuraGS outperforms state-of-the-art methods for few-shot NVS in various metrics, with substantial efficiency improvements. In particular, the proposed method can reduce the number of Gaussians by up to 75% at the inference stage and GPU memory usage by up to 65% at the training stage.
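The paper does not include code; as a minimal illustrative sketch of what a pruning-mask step over Gaussians behind the surface could look like, the fragment below sorts Gaussians by depth along a ray and masks out those whose accumulated transmittance has already dropped below a threshold (so they contribute negligibly to the rendered pixel). The function name, the transmittance criterion, and the threshold value are assumptions for illustration, not the authors' actual algorithm.

```python
import numpy as np

def prune_behind_surface(depths, opacities, t_thresh=0.01):
    """Illustrative behind-surface pruning along one ray.

    Gaussians are processed front-to-back; once the accumulated
    transmittance T falls below t_thresh, the surface is effectively
    opaque and any Gaussian behind it is masked out.
    """
    order = np.argsort(depths)          # front-to-back ordering
    keep = np.zeros(len(depths), dtype=bool)
    T = 1.0                             # transmittance seen so far
    for i in order:
        keep[i] = T > t_thresh          # visible enough to matter?
        T *= (1.0 - opacities[i])       # alpha-compositing update
    return keep

# Example: the third Gaussian lies behind a nearly opaque surface
mask = prune_behind_surface(
    depths=np.array([0.5, 1.0, 2.0]),
    opacities=np.array([0.95, 0.9, 0.5]),
)
```

In this toy case the first two Gaussians survive while the last is pruned, which is the kind of redundancy removal that would shrink the Gaussian count at inference time.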
Source journal
Computers & Graphics-Uk (Engineering/Technology – Computer Science: Software Engineering)
CiteScore: 5.30
Self-citation rate: 12.00%
Articles per year: 173
Review time: 38 days
Journal description: Computers & Graphics is dedicated to disseminating information on research and applications of computer graphics (CG) techniques. The journal encourages articles on:
1. Research and applications of interactive computer graphics. We are particularly interested in novel interaction techniques and applications of CG to problem domains.
2. State-of-the-art papers on late-breaking, cutting-edge research on CG.
3. Information on innovative uses of graphics principles and technologies.
4. Tutorial papers on both teaching CG principles and innovative uses of CG in education.