SeG-Gaussian: Segmentation-Guided 3D Gaussian Optimization for Novel View Synthesis.

Impact Factor: 6.5
Ling-Xiao Zhang, Chenbo Jiang, Yu-Kun Lai, Lin Gao
{"title":"SeG-Gaussian:基于分割引导的新视图合成三维高斯优化。","authors":"Ling-Xiao Zhang, Chenbo Jiang, Yu-Kun Lai, Lin Gao","doi":"10.1109/TVCG.2025.3615421","DOIUrl":null,"url":null,"abstract":"<p><p>Radiance field based methods have recently revolutionized novel view synthesis of scenes captured with multi-view photos. A significant recent advance is 3D Gaussian Splatting (3DGS), which utilizes a set of 3D Gaussians to represent a radiance field, yielding high-fidelity results in real-time rendering. However, we have observed that 3DGS struggles to capture the necessary details in sparsely observed regions, where there is not enough gradient for effective split and clone operations. In this paper, we present a novel solution to address this limitation. Our key idea is to leverage segmentation information to identify poorly optimized regions within the 3D Gaussian representation. By applying split or clone operations on the corresponding 3D Gaussians in these regions, we aim to refine the spatial distribution of Gaussians and enhance the overall quality of high-fidelity 3D scene reconstruction. To further optimize the reconstruction process, we introduce two spatial regularization terms: repulsion loss and smoothness loss. These terms effectively minimize overlap and redundancy among Gaussians, reducing outliers in the synthesized geometry. By incorporating these regularization techniques, our approach achieves state-of-the-art performance in real-time novel view synthesis and significantly improves visibility in less observed regions, leading to a more compact and accurate 3D scene representation.</p>","PeriodicalId":94035,"journal":{"name":"IEEE transactions on visualization and computer graphics","volume":"PP ","pages":""},"PeriodicalIF":6.5000,"publicationDate":"2025-09-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SeG-Gaussian:Segmentation-Guided 3D Gaussian Optimization for Novel View Synthesis.\",\"authors\":\"Ling-Xiao Zhang, Chenbo Jiang, Yu-Kun Lai, Lin Gao\",\"doi\":\"10.1109/TVCG.2025.3615421\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Radiance field based methods have recently revolutionized novel view synthesis of scenes captured with multi-view photos. A significant recent advance is 3D Gaussian Splatting (3DGS), which utilizes a set of 3D Gaussians to represent a radiance field, yielding high-fidelity results in real-time rendering. However, we have observed that 3DGS struggles to capture the necessary details in sparsely observed regions, where there is not enough gradient for effective split and clone operations. In this paper, we present a novel solution to address this limitation. Our key idea is to leverage segmentation information to identify poorly optimized regions within the 3D Gaussian representation. By applying split or clone operations on the corresponding 3D Gaussians in these regions, we aim to refine the spatial distribution of Gaussians and enhance the overall quality of high-fidelity 3D scene reconstruction. To further optimize the reconstruction process, we introduce two spatial regularization terms: repulsion loss and smoothness loss. These terms effectively minimize overlap and redundancy among Gaussians, reducing outliers in the synthesized geometry. 
By incorporating these regularization techniques, our approach achieves state-of-the-art performance in real-time novel view synthesis and significantly improves visibility in less observed regions, leading to a more compact and accurate 3D scene representation.</p>\",\"PeriodicalId\":94035,\"journal\":{\"name\":\"IEEE transactions on visualization and computer graphics\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":6.5000,\"publicationDate\":\"2025-09-29\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on visualization and computer graphics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/TVCG.2025.3615421\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on visualization and computer graphics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TVCG.2025.3615421","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
IEEE Transactions on Visualization and Computer Graphics · doi: 10.1109/TVCG.2025.3615421 · Published 2025-09-29 · Citations: 0

Abstract

Radiance-field-based methods have recently revolutionized novel view synthesis of scenes captured with multi-view photos. A significant recent advance is 3D Gaussian Splatting (3DGS), which represents a radiance field with a set of 3D Gaussians, yielding high-fidelity results in real-time rendering. However, we have observed that 3DGS struggles to capture the necessary details in sparsely observed regions, where the gradients are too weak to trigger effective split and clone operations. In this paper, we present a novel solution to address this limitation. Our key idea is to leverage segmentation information to identify poorly optimized regions within the 3D Gaussian representation. By applying split or clone operations to the corresponding 3D Gaussians in these regions, we refine the spatial distribution of Gaussians and enhance the overall quality of high-fidelity 3D scene reconstruction. To further optimize the reconstruction process, we introduce two spatial regularization terms: a repulsion loss and a smoothness loss. These terms effectively minimize overlap and redundancy among Gaussians, reducing outliers in the synthesized geometry. By incorporating these regularization techniques, our approach achieves state-of-the-art performance in real-time novel view synthesis and significantly improves visibility in less observed regions, leading to a more compact and accurate 3D scene representation.
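The abstract does not include the paper's implementation, but the segmentation-guided densification idea can be sketched. Below is a minimal, hypothetical NumPy sketch: it assumes each Gaussian has already been assigned a segment label (e.g., by projecting 2D segmentation masks onto the Gaussians) and that an average reconstruction error per segment has been computed elsewhere. All names and thresholds (`densify_by_segment`, `seg_error`, `err_thresh`) are illustrative, not taken from the paper.

```python
import numpy as np

def densify_by_segment(means, scales, seg_ids, seg_error,
                       err_thresh=0.1, scale_thresh=0.01):
    """Split or clone Gaussians that fall inside poorly reconstructed segments.

    means:     (N, 3) Gaussian centers
    scales:    (N,)   per-Gaussian isotropic scale (a simplification of full 3DGS)
    seg_ids:   (N,)   segment label assigned to each Gaussian
    seg_error: dict mapping segment label -> average photometric error
    """
    out_means, out_scales = [], []
    for g in range(means.shape[0]):
        bad = seg_error.get(int(seg_ids[g]), 0.0) >= err_thresh
        if bad and scales[g] > scale_thresh:
            # Split: replace one large Gaussian with two smaller ones sampled nearby.
            offsets = np.random.normal(scale=scales[g], size=(2, 3))
            out_means.append(means[g] + offsets)
            out_scales.append(np.full(2, scales[g] / 1.6))
        elif bad:
            # Clone: keep the Gaussian and add a duplicate; later optimization
            # steps move the copies apart to cover the under-fitted region.
            out_means.append(np.repeat(means[g:g + 1], 2, axis=0))
            out_scales.append(np.repeat(scales[g:g + 1], 2))
        else:
            out_means.append(means[g:g + 1])    # well reconstructed: keep as-is
            out_scales.append(scales[g:g + 1])
    return np.concatenate(out_means), np.concatenate(out_scales)
```

As in standard 3DGS densification, large Gaussians are split and small ones are cloned; the difference sketched here is that the trigger is a segment-level error score rather than a view-space gradient threshold, which is what would let under-observed regions densify even when their gradients are weak.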
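The abstract names the two regularizers but does not give their formulas. As one plausible reading (an assumption, not the paper's exact formulation), the repulsion term can penalize neighboring Gaussians whose extents overlap, and the smoothness term can pull each center toward the mean of its neighbors; both can be built on k-nearest-neighbor queries over the Gaussian centers, as in this PyTorch sketch:

```python
import torch

def repulsion_loss(means, scales, k=4):
    """Penalize pairs of Gaussians that sit closer than their combined extent,
    discouraging overlap and redundancy among nearby Gaussians."""
    dist = torch.cdist(means, means)                # (N, N) pairwise center distances
    dist.fill_diagonal_(float("inf"))               # ignore self-distances
    knn_d, knn_i = dist.topk(k, largest=False)      # k nearest neighbors per Gaussian
    combined = scales.unsqueeze(1) + scales[knn_i]  # (N, k) summed extents
    return torch.relu(combined - knn_d).mean()      # only overlapping pairs contribute

def smoothness_loss(means, k=4):
    """Pull each center toward the mean of its neighbors, suppressing the
    isolated geometric outliers mentioned in the abstract."""
    dist = torch.cdist(means, means)
    dist.fill_diagonal_(float("inf"))
    _, knn_i = dist.topk(k, largest=False)
    neighbor_mean = means[knn_i].mean(dim=1)        # (N, 3) local neighborhood mean
    return (means - neighbor_mean).norm(dim=-1).mean()
```

Both terms would simply be added to the photometric loss with small weights. Note that a dense `cdist` is O(N²); a real implementation over millions of Gaussians would need a spatial structure (grid or KD-tree) for the neighbor queries.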
