Maritime scene matching for inter-robot localization across surface and underwater domains

IF 5.2 · CAS Tier 2, Computer Science · Q1, Automation & Control Systems
John McConnell , Ivana Collado-Gonzalez , Paul Szenher , Armon Shariati
DOI: 10.1016/j.robot.2025.105166 · Robotics and Autonomous Systems, Volume 194, Article 105166 · Published 2025-08-23 (Journal Article)
Citations: 0

Abstract

Autonomous underwater vehicles (AUVs) play a crucial role across various sectors, including oil and gas production, civil engineering, and defense. However, underwater localization remains a significant challenge, limiting the widespread adoption of AUVs in these fields. A common strategy to address this challenge is to deploy a fleet of Uncrewed Surface Vessels (USVs), which are easier to localize on the surface, alongside AUVs to help anchor their position estimates. These methods typically rely on acoustic pinging among the robots to relay position-related data. Unfortunately, acoustic pinging requires synchronized clocks and a clear line of sight, making it difficult to deploy large-scale, decentralized teams, particularly in littoral (i.e. near-shore) environments.
To bridge this gap, we propose an alternative approach that is resolvable over asynchronous, intermittent communications. By leveraging the fact that many human-made structures in littoral environments are visible both above and below the waterline, our method automatically detects correspondences between above-water LiDAR scenes and underwater sonar scenes. First, we convert underwater and above-water data into a common representation, generate descriptors to find commonalities, and then apply point cloud registration tools to find rigid-body transformations between them. Lastly, we apply pairwise consistent measurement set maximization (PCM) as a robust outlier rejection system. Our results demonstrate that our solution to this novel Maritime Scene Matching (MSM) problem is both robust to outliers and effective in localizing sonar scenes with an accuracy of less than two meters. Datasets are collected using a single robot equipped with underwater imaging sonar and above-water LiDAR. We have made our real-world datasets, hardware designs, and open-source code available to promote reproducibility and to encourage broader community engagement with the MSM problem. Open-source code: https://github.com/jake3991/maritime-scene-matching.
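The outlier-rejection step described in the abstract can be illustrated with a simplified sketch. The full PCM formulation weights consistency checks by measurement covariance (Mahalanobis distance) and solves a maximum-clique problem on the consistency graph; the toy version below, with function names and tolerances of our own choosing rather than the paper's, treats each candidate sonar-to-LiDAR alignment as a 2D rigid transform (x, y, theta) and brute-forces the largest mutually consistent subset.

```python
from itertools import combinations
import math


def consistent(a, b, trans_tol=2.0, rot_tol=0.2):
    """Two candidate alignments agree if their translation and
    (wrapped) rotation estimates differ by less than the tolerances."""
    dx, dy = a[0] - b[0], a[1] - b[1]
    dtheta = abs(math.atan2(math.sin(a[2] - b[2]), math.cos(a[2] - b[2])))
    return math.hypot(dx, dy) < trans_tol and dtheta < rot_tol


def pcm_inliers(measurements, trans_tol=2.0, rot_tol=0.2):
    """Return indices of the largest mutually consistent subset of
    candidate alignments. Brute-force max-clique search over the
    pairwise-consistency graph; acceptable for small candidate sets."""
    n = len(measurements)
    for k in range(n, 0, -1):  # try the biggest subsets first
        for subset in combinations(range(n), k):
            if all(consistent(measurements[i], measurements[j],
                              trans_tol, rot_tol)
                   for i, j in combinations(subset, 2)):
                return list(subset)
    return []


if __name__ == "__main__":
    # Three candidates agree to within ~1 m / ~0.05 rad; one is a gross outlier.
    candidates = [(0.0, 0.0, 0.0), (0.5, 0.3, 0.05),
                  (10.0, 10.0, 1.0), (0.2, -0.4, 0.02)]
    print(pcm_inliers(candidates))  # the outlier at index 2 is rejected
```

In practice a proper max-clique solver (or the authors' released implementation) would replace the exponential search, but the structure — build a pairwise-consistency graph, keep the largest clique — is the essence of PCM-style robust rejection.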
Source journal
Robotics and Autonomous Systems (Engineering & Technology — Robotics)
CiteScore: 9.00
Self-citation rate: 7.00%
Articles per year: 164
Review time: 4.5 months
Aims & scope: Robotics and Autonomous Systems will carry articles describing fundamental developments in the field of robotics, with special emphasis on autonomous systems. An important goal of this journal is to extend the state of the art in both symbolic and sensory based robot control and learning in the context of autonomous systems. Robotics and Autonomous Systems will carry articles on the theoretical, computational and experimental aspects of autonomous systems, or modules of such systems.