Biomass phenotyping of oilseed rape through UAV multi-view oblique imaging with 3DGS and SAM model

IF 7.7 · Q1 (Agriculture, Multidisciplinary) · CAS Tier 1, Agricultural Sciences
Yutao Shen, Hongyu Zhou, Xin Yang, Xuqi Lu, Ziyue Guo, Lixi Jiang, Yong He, Haiyan Cen
DOI: 10.1016/j.compag.2025.110320
Journal: Computers and Electronics in Agriculture, Volume 235, Article 110320
Published: 2025-03-25
URL: https://www.sciencedirect.com/science/article/pii/S0168169925004260
Citations: 0

Abstract

Biomass estimation of oilseed rape is crucial for optimizing crop productivity and breeding strategies. While UAV-based imaging has advanced high-throughput phenotyping, current methods often rely on orthophoto images, which struggle with overlapping leaves and incomplete structural information in complex field environments. This study integrates 3D Gaussian Splatting (3DGS) with the Segment Anything Model (SAM) for precise 3D reconstruction and biomass estimation of oilseed rape. UAV multi-view oblique images from 36 angles were used to perform 3D reconstruction, with the SAM module enhancing point cloud segmentation. The segmented point clouds were then converted into point cloud volumes, which were fitted to ground-measured biomass using linear regression. The results showed that 3DGS (7 k and 30 k iterations) provided high accuracy, with peak signal-to-noise ratios (PSNR) of 27.43 and 29.53 and training times of 7 and 49 min, respectively. This performance exceeded that of structure from motion (SfM) and mipmap Neural Radiance Fields (Mip-NeRF), demonstrating superior efficiency. The SAM module achieved high segmentation accuracy, with a mean intersection over union (mIoU) of 0.961 and an F1-score of 0.980. Additionally, a comparison of biomass extraction models found the point cloud volume model to be the most accurate, with a determination coefficient (R<sup>2</sup>) of 0.976, a root mean square error (RMSE) of 2.92 g/plant, and a mean absolute percentage error (MAPE) of 6.81 %, outperforming both the plot crop volume and individual crop volume models. This study highlights the potential of combining 3DGS with multi-view UAV imaging for improved biomass phenotyping.
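The final step of the pipeline described above — fitting point cloud volume to ground-measured biomass with linear regression and scoring the fit by R², RMSE, and MAPE — can be sketched as follows. This is a minimal illustration of those metrics, not the authors' code; the data values are invented for demonstration and do not come from the paper.

```python
import numpy as np

def fit_linear(x, y):
    """Least-squares fit y ~ a*x + b."""
    a, b = np.polyfit(x, y, 1)
    return a, b

def evaluate(y_true, y_pred):
    """Return the three metrics reported in the abstract: R^2, RMSE, MAPE (%)."""
    ss_res = np.sum((y_true - y_pred) ** 2)
    ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
    r2 = 1.0 - ss_res / ss_tot
    rmse = np.sqrt(np.mean((y_true - y_pred) ** 2))
    mape = np.mean(np.abs((y_true - y_pred) / y_true)) * 100.0
    return r2, rmse, mape

# Illustrative data only: per-plant point cloud volume vs. measured biomass.
volume = np.array([10.0, 15.0, 20.0, 25.0, 30.0])    # segmented point cloud volume
biomass = np.array([12.1, 18.0, 24.2, 29.8, 36.1])   # ground truth, g/plant

a, b = fit_linear(volume, biomass)
r2, rmse, mape = evaluate(biomass, a * volume + b)
```

A near-linear relationship between volume and biomass, as the paper reports (R² = 0.976), would show up here as R² close to 1 and a small RMSE relative to the biomass range.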
Source journal: Computers and Electronics in Agriculture (Engineering & Technology — Computer Science: Interdisciplinary Applications)
CiteScore: 15.30
Self-citation rate: 14.50%
Annual publications: 800
Review time: 62 days
Journal description: Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and applications notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.