{"title":"Generating 3D Multispectral Point Clouds of Plants with Fusion of Snapshot Spectral and RGB-D Images.","authors":"Pengyao Xie, Ruiming Du, Zhihong Ma, Haiyan Cen","doi":"10.34133/plantphenomics.0040","DOIUrl":null,"url":null,"abstract":"<p><p>Accurate and high-throughput plant phenotyping is important for accelerating crop breeding. Spectral imaging that can acquire both spectral and spatial information of plants related to structural, biochemical, and physiological traits becomes one of the popular phenotyping techniques. However, close-range spectral imaging of plants could be highly affected by the complex plant structure and illumination conditions, which becomes one of the main challenges for close-range plant phenotyping. In this study, we proposed a new method for generating high-quality plant 3-dimensional multispectral point clouds. Speeded-Up Robust Features and Demons was used for fusing depth and snapshot spectral images acquired at close range. A reflectance correction method for plant spectral images based on hemisphere references combined with artificial neural network was developed for eliminating the illumination effects. The proposed Speeded-Up Robust Features and Demons achieved an average structural similarity index measure of 0.931, outperforming the classic approaches with an average structural similarity index measure of 0.889 in RGB and snapshot spectral image registration. The distribution of digital number values of the references at different positions and orientations was simulated using artificial neural network with the determination coefficient (<i>R</i> <sup>2</sup>) of 0.962 and root mean squared error of 0.036. Compared with the ground truth measured by ASD spectrometer, the average root mean squared error of the reflectance spectra before and after reflectance correction at different leaf positions decreased by 78.0%. For the same leaf position, the average Euclidean distances between the multiview reflectance spectra decreased by 60.7%. Our results indicate that the proposed method achieves a good performance in generating plant 3-dimensional multispectral point clouds, which is promising for close-range plant phenotyping.</p>","PeriodicalId":20318,"journal":{"name":"Plant Phenomics","volume":"5 ","pages":"0040"},"PeriodicalIF":7.6000,"publicationDate":"2023-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10069917/pdf/","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Plant Phenomics","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.34133/plantphenomics.0040","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRONOMY","Score":null,"Total":0}
Citations: 4
Abstract
Accurate and high-throughput plant phenotyping is important for accelerating crop breeding. Spectral imaging, which acquires both spectral and spatial information related to structural, biochemical, and physiological traits, has become one of the most popular phenotyping techniques. However, close-range spectral imaging of plants can be strongly affected by complex plant structure and illumination conditions, which remains one of the main challenges for close-range plant phenotyping. In this study, we propose a new method for generating high-quality plant 3-dimensional multispectral point clouds. Speeded-Up Robust Features combined with Demons registration was used to fuse depth and snapshot spectral images acquired at close range. A reflectance correction method for plant spectral images, based on hemisphere references combined with an artificial neural network, was developed to eliminate illumination effects. The proposed Speeded-Up Robust Features and Demons approach achieved an average structural similarity index measure of 0.931 in RGB and snapshot spectral image registration, outperforming classic approaches, which averaged 0.889. The distribution of digital number values of the references at different positions and orientations was simulated using an artificial neural network, with a coefficient of determination (R²) of 0.962 and a root mean squared error of 0.036. Compared with ground truth measured by an ASD spectrometer, the average root mean squared error of the reflectance spectra at different leaf positions decreased by 78.0% after reflectance correction. For the same leaf position, the average Euclidean distance between multiview reflectance spectra decreased by 60.7%. Our results indicate that the proposed method performs well in generating plant 3-dimensional multispectral point clouds and is promising for close-range plant phenotyping.
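To illustrate the two-stage registration named in the abstract (Speeded-Up Robust Features for coarse alignment, Demons for non-rigid refinement), the following Python sketch combines OpenCV and SimpleITK. It is a minimal sketch under assumed inputs and parameter values, not the authors' implementation; the function names and the single-band registration setup are illustrative only.

```python
# Illustrative sketch (not the authors' code): coarse alignment of a snapshot
# spectral band to the RGB frame via SURF keypoint matching, followed by
# non-rigid refinement with Demons registration. Parameter values are guesses.
import cv2                 # SURF requires opencv-contrib built with the non-free modules
import numpy as np
import SimpleITK as sitk

def coarse_align_surf(moving_gray, fixed_gray):
    """Warp moving_gray onto fixed_gray using a RANSAC homography from SURF matches."""
    surf = cv2.xfeatures2d.SURF_create(hessianThreshold=400)
    kp_m, des_m = surf.detectAndCompute(moving_gray, None)
    kp_f, des_f = surf.detectAndCompute(fixed_gray, None)
    matches = cv2.BFMatcher(cv2.NORM_L2).knnMatch(des_m, des_f, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]  # Lowe's ratio test
    src = np.float32([kp_m[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp_f[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    H, _ = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
    h, w = fixed_gray.shape
    return cv2.warpPerspective(moving_gray, H, (w, h))

def refine_demons(moving_gray, fixed_gray, iterations=100):
    """Refine the coarse alignment with classic Demons deformable registration."""
    fixed = sitk.GetImageFromArray(fixed_gray.astype(np.float32))
    moving = sitk.GetImageFromArray(moving_gray.astype(np.float32))
    demons = sitk.DemonsRegistrationFilter()
    demons.SetNumberOfIterations(iterations)
    demons.SetStandardDeviations(1.0)          # Gaussian smoothing of the displacement field
    field = demons.Execute(fixed, moving)      # dense per-pixel displacement field
    transform = sitk.DisplacementFieldTransform(sitk.Cast(field, sitk.sitkVectorFloat64))
    warped = sitk.Resample(moving, fixed, transform, sitk.sitkLinear, 0.0)
    return sitk.GetArrayFromImage(warped)

# Hypothetical usage: register one spectral band (moving) to the grayscale RGB frame (fixed),
# both given as 2-D arrays of the same scene.
# aligned = refine_demons(coarse_align_surf(band, rgb_gray), rgb_gray)
```

Registration quality could then be scored with a structural similarity measure such as skimage.metrics.structural_similarity, the metric reported in the abstract (0.931 versus 0.889 for classic approaches).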
About the Journal:
Plant Phenomics is an Open Access journal published by the American Association for the Advancement of Science (AAAS) in affiliation with the State Key Laboratory of Crop Genetics & Germplasm Enhancement, Nanjing Agricultural University (NAU). Like all partners participating in the Science Partner Journal program, Plant Phenomics is editorially independent from the Science family of journals.
The mission of Plant Phenomics is to publish novel research that advances all aspects of plant phenotyping, from the cell to the plant population level, using innovative combinations of sensor systems and data analytics. The journal also aims to connect phenomics to other science domains, such as genomics, genetics, physiology, molecular biology, bioinformatics, statistics, mathematics, and computer science. Plant Phenomics should thus contribute to advancing plant sciences and agriculture, forestry, and horticulture by addressing key scientific challenges in the area of plant phenomics.
The scope of the journal covers the latest technologies in plant phenotyping for data acquisition, data management, data interpretation, modeling, and their practical applications for crop cultivation, plant breeding, forestry, horticulture, ecology, and other plant-related domains.