E2D-GS: Event-enhanced deblurring Gaussian splatting

IF 7.5 · CAS Tier 1 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Lifeng Lin, Shuangjie Yuan, Lu Yang
DOI: 10.1016/j.eswa.2025.129802
Journal: Expert Systems with Applications, Vol. 298, Article 129802
Published: 2025-09-25
URL: https://www.sciencedirect.com/science/article/pii/S0957417425034177
Citations: 0

Abstract

In recent years, implicit neural representations and explicit 3D Gaussian Splatting (3DGS) have demonstrated substantial advances in novel view synthesis. Nevertheless, the efficacy of these approaches is contingent on sharp input imagery and precise camera pose information; consequently, they are highly susceptible to motion blur, which impedes the rendering of sharp images. Event cameras, which measure intensity changes with microsecond temporal precision, are inherently robust to motion-induced blur. This characteristic opens new avenues for 3D reconstruction in challenging scenarios involving high-speed motion or low-light conditions. This paper introduces E2D-GS, a novel deblurring and reconstruction algorithm based on event cameras and 3D Gaussian Splatting. To enhance reconstruction accuracy, the proposed framework leverages event streams to physically model the formation of motion blur, optimizing the discrepancy between synthesized data and the observed blurry images while simultaneously recovering the camera's motion trajectory. Additionally, to enhance robustness in real-world scenarios, this paper proposes a differential consistency module that mitigates noise in the event data and regularizes the optimization of the Gaussian parameters, thereby improving reconstruction quality under non-ideal conditions. Comprehensive experiments on both simulated and real-world benchmarks validate the method's ability to reconstruct latent sharp imagery from the learned 3DGS representations, and further demonstrate stable reconstruction under adverse conditions. The results show that our approach surpasses the performance of previous works.
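The abstract's "physical model of blur formation from event streams" is not detailed here, but a common formulation in event-based deblurring expresses a blurry frame as the temporal average of latent sharp frames, where each latent frame is obtained from a reference sharp frame by integrating event polarities in log-intensity space. The sketch below illustrates that idea under stated assumptions; the function names, the contrast threshold `C`, and the discrete sampling scheme are hypothetical and not taken from the paper.

```python
import numpy as np

# Assumed event contrast threshold: each event encodes a log-intensity
# change of +/- C at its pixel (a modeling assumption, not from the paper).
C = 0.2

def latent_log_frame(sharp_log, event_sum):
    """Latent log-intensity at time t: L(t) = L(t_ref) + C * (signed event count)."""
    return sharp_log + C * event_sum

def synthesize_blur(sharp, event_sums):
    """Model a blurry frame as the mean of latent sharp frames.

    sharp: HxW reference sharp image with values in (0, 1].
    event_sums: list of HxW arrays, cumulative signed event counts per pixel
                from the reference time to each sampled time t_k.
    """
    log_ref = np.log(np.clip(sharp, 1e-6, None))
    # Map each latent log-frame back to intensity, then average over time.
    frames = [np.exp(latent_log_frame(log_ref, s)) for s in event_sums]
    return np.mean(frames, axis=0)

# Toy usage: a flat gray frame plus symmetric brightness excursions
# recorded as events yields a slightly brighter "blurred" average.
sharp = np.full((4, 4), 0.5)
events = [np.zeros((4, 4)), np.ones((4, 4)), -np.ones((4, 4))]
blurry = synthesize_blur(sharp, events)
```

In a reconstruction pipeline of this kind, the optimizer would compare such a synthesized blurry frame against the observed one while jointly refining the camera trajectory; the sketch only covers the forward blur model.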
Source Journal: Expert Systems with Applications
Category: Engineering Technology - Engineering: Electrical & Electronic
CiteScore: 13.80
Self-citation rate: 10.60%
Articles per year: 2045
Review time: 8.7 months
Journal description: Expert Systems With Applications is an international journal dedicated to the exchange of information on expert and intelligent systems used globally in industry, government, and universities. The journal emphasizes original papers covering the design, development, testing, implementation, and management of these systems, offering practical guidelines. It spans various sectors such as finance, engineering, marketing, law, project management, information management, medicine, and more. The journal also welcomes papers on multi-agent systems, knowledge management, neural networks, knowledge discovery, data mining, and other related areas, excluding applications to military/defense systems.