Multi-modal feature integration from UAV-RGB imagery for high-precision cotton phenotyping: A paradigm shift toward cost-effective agricultural remote sensing

Authors: Xiaoyu Zhi, Qiaomin Chen, Yingchun Han, Beifang Yang, Yaru Wang, Fengqi Wu, Shiwu Xiong, Yahui Jiao, Yunzhen Ma, Shilong Shang, Tao Lin, Yaping Lei, Yabing Li
Journal: Computers and Electronics in Agriculture, Volume 239, Article 111002 (Q1, Agriculture, Multidisciplinary; IF 8.9)
DOI: 10.1016/j.compag.2025.111002
Published: 2025-09-19
URL: https://www.sciencedirect.com/science/article/pii/S0168169925011081
Citations: 0
Abstract
Cost-effective remote sensing solutions are critically needed to democratize precision agriculture technologies. While hyperspectral and LiDAR systems deliver high accuracy, their prohibitive costs limit widespread adoption. This study demonstrates that systematic multi-modal feature integration transforms standard UAV-based RGB imagery into a powerful phenotyping instrument, achieving crop trait prediction accuracy comparable to systems costing 10–50 times more. We developed a comprehensive framework integrating spectral indices, geometric parameters, and texture metrics from commodity RGB sensors to predict five critical cotton traits: leaf area index (LAI), intercepted photosynthetically active radiation (IPAR), above-ground biomass, lint yield, and seed cotton yield. The progressive integration approach employed Random Forest regression with four feature configurations: baseline color indices (CI_base), refined color indices (CI_ref), geometric parameters (CI_ref + GP), and texture metrics (CI_ref + GP + T). Field experiments across three trials over two growing seasons (2022–2023) with varying genotypes, planting densities, and sowing dates provided 2,126 ground truth measurements for model development and validation. The optimal multi-modal model achieved R² = 0.97 for IPAR (rRMSE = 6 %), R² = 0.91 for LAI (rRMSE = 15 %), and R² = 0.85 for biomass (rRMSE = 32 %), with lint yield and seed cotton yield demonstrating R² values of 0.92 and 0.77, respectively. Variance partitioning analysis revealed texture features as the dominant contributor (16.2 % ± 7.1 %), followed by spectral indices (9.1 % ± 4.2 %) and geometric parameters (8.0 % ± 2.8 %), with substantial shared variance (45–65 %) indicating strong feature complementarity. Phenological analysis demonstrated that flowering-stage imagery outperformed boll opening stage measurements, while stage-general models showed superior robustness. Cross-temporal validation confirmed model generalizability, with trial-general models achieving R² values of 0.91–0.97 for IPAR across diverse environmental conditions. The framework enables sub-meter spatial resolution trait mapping while maintaining operational simplicity and cost-effectiveness, demonstrating that systematic feature engineering can democratize high-precision phenotyping technologies for broader agricultural applications.
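The progressive feature-integration workflow described in the abstract (Random Forest regression evaluated over nested feature sets, scored by R² and relative RMSE) can be sketched as follows. This is a minimal illustration with synthetic stand-in data, not the authors' implementation: the feature names, dimensions, coefficients, and the exact rRMSE normalization are assumptions for demonstration.

```python
# Sketch: progressive multi-modal feature integration with Random Forest.
# All feature values below are synthetic stand-ins for UAV-RGB derivatives.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
n = 400  # synthetic plot-level observations

ci = rng.normal(size=(n, 4))   # color indices (e.g., ExG-, VARI-like values; hypothetical)
gp = rng.normal(size=(n, 2))   # geometric parameters (e.g., canopy cover, height proxy)
tex = rng.normal(size=(n, 3))  # texture metrics (e.g., GLCM contrast, entropy)

# Synthetic trait (e.g., LAI) driven by all three modalities plus noise
y = (ci @ np.array([0.5, -0.3, 0.2, 0.1])
     + gp @ np.array([0.4, 0.3])
     + tex @ np.array([0.3, -0.2, 0.1])
     + rng.normal(0, 0.3, n))

def rrmse(y_true, y_pred):
    """Relative RMSE: RMSE normalized by the mean observed magnitude (assumed definition)."""
    return np.sqrt(np.mean((y_true - y_pred) ** 2)) / np.mean(np.abs(y_true))

# Nested feature configurations mirroring CI_ref -> CI_ref + GP -> CI_ref + GP + T
configs = {
    "CI_ref": ci,
    "CI_ref + GP": np.hstack([ci, gp]),
    "CI_ref + GP + T": np.hstack([ci, gp, tex]),
}
for name, X in configs.items():
    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=42)
    model = RandomForestRegressor(n_estimators=300, random_state=42).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    print(f"{name}: R2={r2_score(y_te, pred):.2f}, rRMSE={100 * rrmse(y_te, pred):.0f}%")
```

On data where each modality genuinely carries signal, the richer configurations should score higher, which is the pattern the paper reports for its field measurements.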
About the journal:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics like agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.