A deep learning approach for deriving wheat phenology from near-surface RGB image series using spatiotemporal fusion.

IF 4.7 · CAS Zone 2 (Biology) · Q1 (BIOCHEMICAL RESEARCH METHODS)
Yucheng Cai, Yan Li, Xuerui Qi, Jianqing Zhao, Li Jiang, Yongchao Tian, Yan Zhu, Weixing Cao, Xiaohu Zhang
Plant Methods, vol. 20, no. 1, p. 153. Published 2024-09-30. DOI: 10.1186/s13007-024-01278-0. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11443927/pdf/
Citations: 0

Abstract


Accurate monitoring of wheat phenological stages is essential for effective crop management and informed agricultural decision-making. Traditional methods often rely on labour-intensive field surveys, which are prone to subjective bias and limited temporal resolution. To address these challenges, this study explores the potential of near-surface cameras combined with an advanced deep-learning approach to derive wheat phenological stages from high-quality, real-time RGB image series. Three deep learning models based on three different spatiotemporal feature fusion methods, namely sequential fusion, synchronous fusion, and parallel fusion, were constructed and evaluated for deriving wheat phenological stages from these near-surface RGB image series. Moreover, the impact of different image resolutions, capture perspectives, and model training strategies on the performance of the deep learning models was also investigated. The results indicate that the model using the sequential fusion method is optimal, with an overall accuracy (OA) of 0.935, a mean absolute error (MAE) of 0.069, an F1-score (F1) of 0.936, and a kappa coefficient (Kappa) of 0.924 for wheat phenological stage detection. In addition, an enhanced image resolution of 512 × 512 pixels and a suitable image capture perspective, specifically a sensor viewing angle of 40° to 60° from vertical, introduce more effective features for phenological stage detection, thereby enhancing the model's accuracy. Furthermore, concerning model training, applying a two-step fine-tuning strategy also enhances the model's robustness to random variations in perspective. This research introduces an innovative approach for real-time phenological stage detection and provides a solid foundation for precision agriculture.
By accurately deriving critical phenological stages, the methodology developed in this study supports the optimization of crop management practices, which may result in improved resource efficiency and sustainability across diverse agricultural settings. The implications of this work extend beyond wheat, offering a scalable solution that can be adapted to monitor other crops, thereby contributing to more efficient and sustainable agricultural systems.
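The abstract reports four standard evaluation metrics: overall accuracy (OA), mean absolute error (MAE) over the ordinal stage index, F1-score, and Cohen's kappa. The paper's own evaluation code is not given here; the following is a minimal sketch of how these metrics are conventionally computed for ordinal stage labels, where the function name `phenology_metrics` and the example labels are illustrative, not from the paper:

```python
import numpy as np

def phenology_metrics(y_true, y_pred, n_stages):
    """OA, MAE (in stage indices), macro F1, and Cohen's kappa
    for phenological-stage classification."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    n = len(y_true)

    # Overall accuracy: fraction of images assigned the correct stage.
    oa = float(np.mean(y_true == y_pred))

    # MAE over ordinal stage indices: how many stages off, on average.
    mae = float(np.mean(np.abs(y_true - y_pred)))

    # Confusion matrix (rows: true stage, columns: predicted stage).
    cm = np.zeros((n_stages, n_stages), dtype=float)
    for t, p in zip(y_true, y_pred):
        cm[t, p] += 1

    # Macro F1: unweighted mean of per-stage F1 scores.
    f1s = []
    for k in range(n_stages):
        tp = cm[k, k]
        fp = cm[:, k].sum() - tp
        fn = cm[k, :].sum() - tp
        denom = 2 * tp + fp + fn
        f1s.append(2 * tp / denom if denom > 0 else 0.0)
    f1 = float(np.mean(f1s))

    # Cohen's kappa: observed agreement corrected for chance agreement.
    pe = float((cm.sum(axis=0) * cm.sum(axis=1)).sum()) / n**2
    kappa = (oa - pe) / (1 - pe) if pe < 1 else 1.0

    return {"OA": oa, "MAE": mae, "F1": f1, "Kappa": kappa}
```

Because the stage labels are ordinal, MAE complements OA by distinguishing a one-stage miss from a several-stage miss, which a plain accuracy score cannot do.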

Source journal
Plant Methods (Biology – Plant Sciences)
CiteScore: 9.20
Self-citation rate: 3.90%
Articles per year: 121
Review time: 2 months
About the journal: Plant Methods is an open access, peer-reviewed, online journal for the plant research community that encompasses all aspects of technological innovation in the plant sciences. There is no doubt that we have entered an exciting new era in plant biology. The completion of the Arabidopsis genome sequence, and the rapid progress being made in other plant genomics projects, are providing unparalleled opportunities for progress in all areas of plant science. Nevertheless, enormous challenges lie ahead if we are to understand the function of every gene in the genome, and how the individual parts work together to make the whole organism. Achieving these goals will require an unprecedented collaborative effort, combining high-throughput, system-wide technologies with more focused approaches that integrate traditional disciplines such as cell biology, biochemistry and molecular genetics. Technological innovation is probably the most important catalyst for progress in any scientific discipline. Plant Methods' goal is to stimulate the development and adoption of new and improved techniques and research tools and, where appropriate, to promote consistency of methodologies for better integration of data from different laboratories.