Perceptual Quality Assessment for NeRF-Generated Scenes: A Training Reference Metric

IF 3.6 · CAS Zone 3, Computer Science · JCR Q2, COMPUTER SCIENCE, INFORMATION SYSTEMS
Shihao Luo; Nguyen Tien Phong; Chibuike Onuoha; Truong Cong Thang
DOI: 10.1109/ACCESS.2025.3603970
Journal: IEEE Access, vol. 13, pp. 152277-152292
Published: 2025-08-29 (Journal Article)
Document: https://ieeexplore.ieee.org/document/11145061/
PDF: https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=11145061
Code: https://github.com/WemoLuo/NVQS
Citations: 0

Abstract

Perceptual quality assessment is a key challenge in traditional image processing as well as in emerging AI-based novel view synthesis methods such as neural radiance fields (NeRF). NeRF has revolutionized 3D scene reconstruction by leveraging neural networks for volumetric rendering, achieving unprecedented photorealistic results. Currently, perceptual quality assessment for NeRF still relies heavily on full-reference (FR) metrics, such as PSNR and SSIM, which require external reference images captured from predefined camera positions and therefore suffer from significant limitations. In this paper, we propose a new quality metric that directly leverages training views to quantify the perceptual quality of NeRF-generated scenes, eliminating the need for external predefined reference images and camera-position metadata. In the proposed approach, we first extract hierarchical, abstract features from the training views using pretrained deep convolutional neural networks (CNNs), and then construct a reference representation for perceptual quality evaluation through feature-space interpolation. We evaluated the approach in two configurations: one using pretrained CNNs only (without calibration), and another with calibration applied to learn the importance of the hierarchical feature stages. The experimental results demonstrate the effectiveness of the proposed method, which outperforms traditional no-reference (NR) metrics while remaining comparable to popular FR metrics. We find that deep features trained for high-level classification tasks have strong potential to quantify perceptual quality across different viewpoints of the same object in NeRF. The code is released at https://github.com/WemoLuo/NVQS
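The core idea described in the abstract can be sketched as follows. This is a minimal NumPy illustration, not the paper's implementation: the stand-in feature arrays, the `interpolate_reference` and `quality_score` names, and the uniform stage weights are all hypothetical; the actual method uses hierarchical features from pretrained CNNs (see the released code for details).

```python
import numpy as np

def cosine_sim(a, b):
    """Cosine similarity between two flattened feature maps."""
    a, b = a.ravel(), b.ravel()
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-8))

def interpolate_reference(feats_a, feats_b, alpha=0.5):
    """Blend per-stage features of two training views in feature space
    to form a reference representation for a novel viewpoint."""
    return [alpha * fa + (1.0 - alpha) * fb for fa, fb in zip(feats_a, feats_b)]

def quality_score(render_feats, ref_feats, stage_weights=None):
    """Weighted cosine similarity across hierarchical feature stages.

    stage_weights=None mirrors the uncalibrated option (uniform weights);
    passing a learned weight vector mirrors the calibrated option.
    """
    if stage_weights is None:
        stage_weights = np.full(len(ref_feats), 1.0 / len(ref_feats))
    sims = [cosine_sim(r, f) for r, f in zip(render_feats, ref_feats)]
    return float(np.dot(stage_weights, sims))

# Stand-in "CNN features": three stages with decreasing spatial resolution,
# as a pretrained backbone would produce for two training views.
rng = np.random.default_rng(0)
shapes = [(16, 32, 32), (32, 16, 16), (64, 8, 8)]
view_a = [rng.standard_normal(s) for s in shapes]
view_b = [rng.standard_normal(s) for s in shapes]

reference = interpolate_reference(view_a, view_b, alpha=0.5)
score = quality_score(view_a, reference)  # a render near training view A
```

A rendered view identical to the reference representation scores 1.0, while unrelated feature maps score near 0; the calibrated variant would replace the uniform `stage_weights` with weights learned to match human perceptual judgments.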
Source Journal

IEEE Access — COMPUTER SCIENCE, INFORMATION SYSTEMS; ENGINEERING, ELECTRICAL & ELECTRONIC
CiteScore: 9.80
Self-citation rate: 7.70%
Articles per year: 6673
Review time: 6 weeks
Journal description: IEEE Access® is a multidisciplinary, open access (OA), applications-oriented, all-electronic archival journal that continuously presents the results of original research or development across all of IEEE's fields of interest. IEEE Access will publish articles that are of high interest to readers, original, technically correct, and clearly presented. Supported by author publication charges (APC), its hallmarks are a rapid peer review and publication process with open access to all readers. Unlike IEEE's traditional Transactions or Journals, reviews are "binary", in that reviewers will either Accept or Reject an article in the form it is submitted in order to achieve rapid turnaround. Especially encouraged are submissions on: multidisciplinary topics, or applications-oriented articles and negative results that do not fit within the scope of IEEE's traditional journals; practical articles discussing new experiments or measurement techniques, or interesting solutions to engineering problems; development of new or improved fabrication or manufacturing techniques; and reviews or survey articles of new or evolving fields oriented to assist others in understanding the new area.