Registration of close-range, multi-lens multispectral imagery by retrieving the scene 3D structure

IF 10.6 | CAS Zone 1 (Earth Science) | JCR Q1 (Geography, Physical)
Sylvain Jay , Frédéric Baret , Samuel Thomas , Marie Weiss
{"title":"Registration of close-range, multi-lens multispectral imagery by retrieving the scene 3D structure","authors":"Sylvain Jay ,&nbsp;Frédéric Baret ,&nbsp;Samuel Thomas ,&nbsp;Marie Weiss","doi":"10.1016/j.isprsjprs.2025.06.001","DOIUrl":null,"url":null,"abstract":"<div><div>Multispectral, multi-lens cameras, which acquire spectal images from different individual cameras equipped with different optical filters, are among the most widely used multispectral cameras available on the market. However, their use for close-range sensing is limited by the lack of registration algorithms capable of handling the strong parallax effects observed on scenes with non-negligible relief. In this paper, we propose a method based on stereo camera calibration and disparity estimation to register a close-range multispectral image while retrieving the corresponding 3D point cloud. The method takes advantage of the rigidity of these cameras and the synchronized capture of multispectral bands, both of which are thus compulsory. The algorithm is three-fold. First, the optimal combination of band pair alignments is found. Then, the semi-global matching stereovision algorithm combined with a robust matching cost function are used to align these band pairs and to compute the point cloud. Finally, a pixel filling step that exploits the spectral covariances of the different classes of materials in the image is implemented to limit the number of missing pixels, e.g., due to occlusions.</div><div>The method was tested on Airphen multispectral images of four plant crops (wheat, sunflower, cover crops and maize) acquired at a distance to the ground ranging from 1.5 to 3 m, thus encompassing a large variability in 3D structure and parallax effects. The results demonstrate that the proposed method achieves better registration performance than six state-of-the-art existing methods, while maintaining a reasonable processing time. Further, the point cloud provides accurate information on the 3D structure of the imaged scene, as shown by the centimetric plant height estimation accuracy. As the point cloud is aligned with the registered multispectral bands, the method provides a 4D (spectral and spatial) description of the scene with a single image, i.e., a multispectral point cloud. This opens up interesting prospects for several applications in close-range sensing including, but not restricted to, vegetation characterization.</div></div>","PeriodicalId":50269,"journal":{"name":"ISPRS Journal of Photogrammetry and Remote Sensing","volume":"227 ","pages":"Pages 125-144"},"PeriodicalIF":10.6000,"publicationDate":"2025-06-12","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ISPRS Journal of Photogrammetry and Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0924271625002229","RegionNum":1,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"GEOGRAPHY, PHYSICAL","Score":null,"Total":0}
Citations: 0

Abstract

Multispectral, multi-lens cameras, which acquire spectral images from different individual cameras equipped with different optical filters, are among the most widely used multispectral cameras available on the market. However, their use for close-range sensing is limited by the lack of registration algorithms capable of handling the strong parallax effects observed on scenes with non-negligible relief. In this paper, we propose a method based on stereo camera calibration and disparity estimation to register a close-range multispectral image while retrieving the corresponding 3D point cloud. The method takes advantage of the rigidity of these cameras and the synchronized capture of the multispectral bands, both of which are therefore required. The algorithm proceeds in three steps. First, the optimal combination of band pair alignments is found. Then, the semi-global matching stereovision algorithm, combined with a robust matching cost function, is used to align these band pairs and to compute the point cloud. Finally, a pixel-filling step that exploits the spectral covariances of the different classes of materials in the image is implemented to limit the number of missing pixels, e.g., due to occlusions.
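As a rough illustration of the band-pair alignment step (not the authors' implementation), the sketch below aligns one stereo-rectified band pair with OpenCV's semi-global block matcher and warps the secondary band onto the reference band using the estimated disparity. OpenCV's built-in matching cost stands in for the robust multispectral cost described in the paper, and the function name and parameter values are illustrative assumptions.

```python
# Minimal sketch (not the paper's code): align one calibrated, rectified band pair
# with semi-global matching, then warp the secondary band onto the reference band
# using the estimated disparity. OpenCV's default matching cost is used here in
# place of the robust multispectral cost proposed in the paper.
import cv2
import numpy as np

def align_band_pair(ref_band: np.ndarray, sec_band: np.ndarray,
                    max_disparity: int = 64) -> tuple[np.ndarray, np.ndarray]:
    """Return (disparity map in pixels, secondary band warped onto the reference)."""
    sgbm = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=max_disparity,   # must be a multiple of 16
        blockSize=5,
        P1=8 * 5 * 5,                   # SGM smoothness penalties
        P2=32 * 5 * 5,
        uniquenessRatio=10,
        speckleWindowSize=100,
        speckleRange=2,
    )
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disp = sgbm.compute(ref_band, sec_band).astype(np.float32) / 16.0

    # For a rectified pair, the match of reference pixel (x, y) lies at (x - d, y)
    # in the secondary band, so sample the secondary band at x - d.
    h, w = ref_band.shape
    xs, ys = np.meshgrid(np.arange(w, dtype=np.float32),
                         np.arange(h, dtype=np.float32))
    warped = cv2.remap(sec_band, xs - disp, ys,
                       interpolation=cv2.INTER_LINEAR, borderValue=0)
    warped[disp <= 0] = 0   # invalid disparities left as missing pixels for later filling
    return disp, warped
```

In the paper, the pixels left missing here (occlusions, failed matches) are subsequently filled using the spectral covariances of the material classes present in the image; that step is not reproduced in this sketch.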
The method was tested on Airphen multispectral images of four crops (wheat, sunflower, cover crops and maize) acquired at distances to the ground ranging from 1.5 to 3 m, thus encompassing a large variability in 3D structure and parallax effects. The results demonstrate that the proposed method achieves better registration performance than six existing state-of-the-art methods, while maintaining a reasonable processing time. Further, the point cloud provides accurate information on the 3D structure of the imaged scene, as shown by the centimetric plant height estimation accuracy. As the point cloud is aligned with the registered multispectral bands, the method provides a 4D (spectral and spatial) description of the scene from a single image, i.e., a multispectral point cloud. This opens up interesting prospects for several close-range sensing applications, including, but not restricted to, vegetation characterization.
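To illustrate how a disparity map, the rig calibration, and the registered bands could be combined into such a multispectral point cloud (a sketch under assumptions, not the paper's method), the snippet below reprojects disparities to 3D with OpenCV and attaches the spectral values of each valid pixel. Here `Q` is assumed to be the 4x4 reprojection matrix returned by `cv2.stereoRectify` for the calibrated band pair, `bands` a hypothetical stack of registered bands, and the plant-height proxy a deliberately crude illustration of how heights might be read from the cloud.

```python
# Illustrative sketch only: build a multispectral point cloud from a disparity
# map and the registered band stack, then derive a rough plant-height proxy.
import cv2
import numpy as np

def multispectral_point_cloud(disp: np.ndarray, bands: np.ndarray,
                              Q: np.ndarray) -> np.ndarray:
    """Return an (N, 3 + n_bands) array of [X, Y, Z, band_1, ..., band_n].

    disp  : (H, W) disparity map in pixels
    bands : (n_bands, H, W) registered spectral bands aligned with the reference band
    Q     : 4x4 disparity-to-depth matrix from the stereo rectification (assumed input)
    """
    points = cv2.reprojectImageTo3D(disp, Q)            # (H, W, 3) in the calibration's metric units
    valid = np.isfinite(points).all(axis=2) & (disp > 0)
    xyz = points[valid]                                  # (N, 3)
    spectra = bands[:, valid].T.astype(np.float32)       # (N, n_bands)
    return np.hstack([xyz, spectra])

def plant_height(cloud: np.ndarray, soil_percentile: float = 98.0) -> np.ndarray:
    """Crude height proxy: distance of each point to an assumed soil level,
    taking the farthest surface from the camera as the soil."""
    z = cloud[:, 2]
    soil_z = np.percentile(z, soil_percentile)
    return np.clip(soil_z - z, 0.0, None)
```

The 3 + n_bands columns of the returned array are what the abstract refers to as a 4D (spatial and spectral) description of the scene obtained from a single capture.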
Source journal
ISPRS Journal of Photogrammetry and Remote Sensing (Imaging Science & Photographic Technology)
CiteScore: 21.00
Self-citation rate: 6.30%
Articles published: 273
Review time: 40 days
Journal description: The ISPRS Journal of Photogrammetry and Remote Sensing (P&RS) serves as the official journal of the International Society for Photogrammetry and Remote Sensing (ISPRS). It acts as a platform for scientists and professionals worldwide who are involved in various disciplines that utilize photogrammetry, remote sensing, spatial information systems, computer vision, and related fields. The journal aims to facilitate communication and dissemination of advancements in these disciplines, while also acting as a comprehensive source of reference and archive. P&RS endeavors to publish high-quality, peer-reviewed research papers that are preferably original and have not been published before. These papers can cover scientific/research, technological development, or application/practical aspects. Additionally, the journal welcomes papers that are based on presentations from ISPRS meetings, as long as they are considered significant contributions to the aforementioned fields. In particular, P&RS encourages the submission of papers that are of broad scientific interest, showcase innovative applications (especially in emerging fields), have an interdisciplinary focus, discuss topics that have received limited attention in P&RS or related journals, or explore new directions in scientific or professional realms. It is preferred that theoretical papers include practical applications, while papers focusing on systems and applications should include a theoretical background.