A Probabilistic Formulation of LiDAR Mapping With Neural Radiance Fields

Impact Factor: 4.6 · JCR Q2 (Robotics) · CAS Tier 2 (Computer Science)
Matthew McDermott; Jason Rife
IEEE Robotics and Automation Letters, vol. 10, no. 6, pp. 5409-5416. Published 2025-04-02. DOI: 10.1109/LRA.2025.3557301. https://ieeexplore.ieee.org/document/10947591/
Citations: 0

Abstract

In this letter we reexamine the process through which a Neural Radiance Field (NeRF) can be trained to produce novel LiDAR views of a scene. Unlike image applications where camera pixels integrate light over time, LiDAR pulses arrive at specific times. As such, multiple LiDAR returns are possible for any given detector and the classification of these returns is inherently probabilistic. Applying a traditional NeRF training routine can result in the network learning “phantom surfaces” in free space between conflicting range measurements, similar to how “floater” aberrations may be produced by an image model. We show that by formulating loss as an integral of probability (rather than as an integral of optical density) the network can learn multiple peaks for a given ray, allowing the sampling of first, $\text{n}^{\text{th}}$, or strongest returns from a single output channel.
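The abstract contrasts the usual density-integral rendering with a probabilistic reading of the per-ray weight profile that admits multiple return peaks. As a rough illustration only (not the paper's actual loss formulation), the sketch below computes standard NeRF-style ray weights for a toy two-surface ray and treats the weight profile as a distribution over return depth, from which first and strongest returns can be read off. All densities, depths, and the peak threshold are invented for the example.

```python
import numpy as np

def render_weights(sigma, delta):
    """Standard NeRF quadrature: w_i = T_i * (1 - exp(-sigma_i * delta_i)),
    where T_i is the transmittance accumulated before sample i."""
    alpha = 1.0 - np.exp(-sigma * delta)
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))
    return trans * alpha

def peak_returns(weights, depths, thresh=0.02):
    """Treat the per-ray weight profile as a distribution over return depth
    and extract its local maxima (candidate LiDAR returns)."""
    idx = [i for i in range(1, len(weights) - 1)
           if weights[i] > weights[i - 1]
           and weights[i] >= weights[i + 1]
           and weights[i] > thresh]
    return [(depths[i], weights[i]) for i in idx]

# Toy ray with two surfaces: a weak one near 4 m, a strong one near 7 m.
depths = np.linspace(0.0, 10.0, 200)
delta = np.full_like(depths, depths[1] - depths[0])
sigma = (1.5 * np.exp(-0.5 * ((depths - 4.0) / 0.1) ** 2)
         + 6.0 * np.exp(-0.5 * ((depths - 7.0) / 0.1) ** 2))

w = render_weights(sigma, delta)
peaks = peak_returns(w, depths)
first_return = peaks[0][0]                            # earliest peak along the ray
strongest_return = max(peaks, key=lambda p: p[1])[0]  # highest-weight peak
```

Note that under the conventional density formulation, accumulated transmittance attenuates later surfaces, which is one way conflicting range measurements can push the network toward a single compromise ("phantom") surface; the letter's contribution is to define the loss so that multiple peaks like these remain learnable from a single output channel.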
Source journal: IEEE Robotics and Automation Letters (Computer Science: Computer Science Applications)
CiteScore: 9.60
Self-citation rate: 15.40%
Annual publications: 1428
Journal description: The scope of this journal is to publish peer-reviewed articles that provide a timely and concise account of innovative research ideas and application results, reporting significant theoretical findings and application case studies in areas of robotics and automation.