SfM on-the-fly: A robust near real-time SfM for spatiotemporally disordered high-resolution imagery from multiple agents

IF 10.6 | CAS Tier 1 (Earth Science) | Q1 Geography, Physical
Zongqian Zhan, Yifei Yu, Rui Xia, Wentian Gan, Hong Xie, Giulio Perda, Luca Morelli, Fabio Remondino, Xin Wang
{"title":"动态SfM:一种鲁棒的近实时动态SfM,适用于来自多个智能体的时空无序高分辨率图像","authors":"Zongqian Zhan ,&nbsp;Yifei Yu ,&nbsp;Rui Xia ,&nbsp;Wentian Gan ,&nbsp;Hong Xie ,&nbsp;Giulio Perda ,&nbsp;Luca Morelli ,&nbsp;Fabio Remondino ,&nbsp;Xin Wang","doi":"10.1016/j.isprsjprs.2025.04.002","DOIUrl":null,"url":null,"abstract":"<div><div>In the last twenty years, Structure from Motion (SfM) has been a constant research hotspot in the fields of photogrammetry, computer vision, robotics etc., whereas real-time performance has only recently emerged as a topic of growing interest. This work builds upon the original on-the-fly SfM (Zhan et al., 2024) and presents an updated version (v2) with three new advancements to get better SfM reconstruction results during image capturing: (i) near real-time image matching is further boosted by employing the Hierarchical Navigable Small World (HNSW) graphs, and more true positive overlapping image candidates can be faster identified; (ii) a self-adaptive weighting strategy is proposed for robust hierarchical local bundle adjustment to improve the SfM results; (iii) multiple agents are included for supporting collaborative SfM and seamlessly merge multiple 3D reconstructions into a complete 3D scene in presence of commonly registered images. Various comprehensive experiments demonstrate that the proposed SfM method (named on-the-fly SfMv2) can generate more complete and robust 3D reconstructions in a time-efficient way. Code is available at <span><span>http://yifeiyu225.github.io/on-the-flySfMv2.github.io/</span><svg><path></path></svg></span>.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"224 ","pages":"Pages 202-221"},"PeriodicalIF":10.6000,"publicationDate":"2025-04-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SfM on-the-fly: A robust near real-time SfM for spatiotemporally disordered high-resolution imagery from multiple agents\",\"authors\":\"Zongqian Zhan ,&nbsp;Yifei Yu ,&nbsp;Rui Xia ,&nbsp;Wentian Gan ,&nbsp;Hong Xie ,&nbsp;Giulio Perda ,&nbsp;Luca Morelli ,&nbsp;Fabio Remondino ,&nbsp;Xin Wang\",\"doi\":\"10.1016/j.isprsjprs.2025.04.002\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>In the last twenty years, Structure from Motion (SfM) has been a constant research hotspot in the fields of photogrammetry, computer vision, robotics etc., whereas real-time performance has only recently emerged as a topic of growing interest. This work builds upon the original on-the-fly SfM (Zhan et al., 2024) and presents an updated version (v2) with three new advancements to get better SfM reconstruction results during image capturing: (i) near real-time image matching is further boosted by employing the Hierarchical Navigable Small World (HNSW) graphs, and more true positive overlapping image candidates can be faster identified; (ii) a self-adaptive weighting strategy is proposed for robust hierarchical local bundle adjustment to improve the SfM results; (iii) multiple agents are included for supporting collaborative SfM and seamlessly merge multiple 3D reconstructions into a complete 3D scene in presence of commonly registered images. Various comprehensive experiments demonstrate that the proposed SfM method (named on-the-fly SfMv2) can generate more complete and robust 3D reconstructions in a time-efficient way. 
Code is available at <span><span>http://yifeiyu225.github.io/on-the-flySfMv2.github.io/</span><svg><path></path></svg></span>.</div></div>\",\"PeriodicalId\":50269,\"journal\":{\"name\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"volume\":\"224 \",\"pages\":\"Pages 202-221\"},\"PeriodicalIF\":10.6000,\"publicationDate\":\"2025-04-12\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ISPRS Journal of Photogrammetry and Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0924271625001388\",\"RegionNum\":1,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"GEOGRAPHY, PHYSICAL\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625001388","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0

Abstract

In the last twenty years, Structure from Motion (SfM) has been a constant research hotspot in the fields of photogrammetry, computer vision, robotics etc., whereas real-time performance has only recently emerged as a topic of growing interest. This work builds upon the original on-the-fly SfM (Zhan et al., 2024) and presents an updated version (v2) with three new advancements to get better SfM reconstruction results during image capturing: (i) near real-time image matching is further boosted by employing the Hierarchical Navigable Small World (HNSW) graphs, and more true positive overlapping image candidates can be faster identified; (ii) a self-adaptive weighting strategy is proposed for robust hierarchical local bundle adjustment to improve the SfM results; (iii) multiple agents are included for supporting collaborative SfM and seamlessly merge multiple 3D reconstructions into a complete 3D scene in presence of commonly registered images. Various comprehensive experiments demonstrate that the proposed SfM method (named on-the-fly SfMv2) can generate more complete and robust 3D reconstructions in a time-efficient way. Code is available at http://yifeiyu225.github.io/on-the-flySfMv2.github.io/.
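
To illustrate advancement (i), the sketch below shows how an HNSW graph can support incremental retrieval of overlapping image candidates during capture. This is not the authors' implementation: it assumes each incoming image is summarized by a precomputed global descriptor, and the hnswlib library, the OverlapCandidateIndex class, and all parameter values are illustrative assumptions. It only demonstrates the query-then-insert pattern that keeps candidate matching near real-time as the image set grows.

# Minimal sketch (not the paper's code) of HNSW-based overlap-candidate retrieval.
# Assumption: each image has a global descriptor (e.g. a learned retrieval feature).
import numpy as np
import hnswlib


class OverlapCandidateIndex:
    """Incrementally indexes image descriptors and, for each newly captured
    image, returns the previously seen images most likely to overlap with it."""

    def __init__(self, dim: int, max_images: int = 100_000):
        self.index = hnswlib.Index(space="cosine", dim=dim)
        # ef_construction and M trade graph quality against build speed (illustrative values).
        self.index.init_index(max_elements=max_images, ef_construction=200, M=16)
        self.index.set_ef(64)  # query-time search breadth
        self.next_id = 0

    def query_and_add(self, descriptor: np.ndarray, k: int = 10):
        """Return (image_id, cosine_distance) pairs for the k nearest indexed
        images, then insert the new descriptor so later images can match it."""
        candidates = []
        if self.next_id > 0:
            k_eff = min(k, self.next_id)
            labels, dists = self.index.knn_query(descriptor[None, :], k=k_eff)
            candidates = list(zip(labels[0].tolist(), dists[0].tolist()))
        self.index.add_items(descriptor[None, :], np.array([self.next_id]))
        self.next_id += 1
        return candidates


if __name__ == "__main__":
    # Usage with random stand-in descriptors (illustration only).
    rng = np.random.default_rng(0)
    retriever = OverlapCandidateIndex(dim=256)
    for i in range(5):
        desc = rng.random(256).astype(np.float32)
        print(f"image {i}: candidate overlaps -> {retriever.query_and_add(desc, k=3)}")

Because HNSW queries scale roughly logarithmically with the number of indexed images, the candidate search remains fast as capture proceeds, which is consistent with the abstract's claim that true-positive overlapping candidates can be identified faster.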
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing
Category: Engineering & Technology, Imaging Science & Photographic Technology
CiteScore: 21.00
Self-citation rate: 6.30%
Articles published: 273
Review time: 40 days
Journal description: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive.

P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields.

In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.