Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0023
Peter M Dracatos, Stefanie Lück, Dimitar K Douchkov
Diversifying Resistance Mechanisms in Cereal Crops Using Microphenomics.

Whole-grain cereals, including wheat, barley, oat, rye, corn, rice, millet, and triticale, are a rich source of calories, essential vitamins, minerals, and phytochemicals that both nourish humans and animals and protect them from diseases such as heart attack and cancer [1]. However, susceptibility to foliar diseases caused by necrotrophic or biotrophic fungal pathogens continues to reduce yield potential or lead to total crop failure and famine in developing nations [2]. Historically, foliar diseases of cereals have been controlled using either fungicide treatment [3] or plant breeding [4]. However, fungicides create selection pressure favoring the emergence of insensitive pathogen variants and are both expensive and harmful to human health and the environment. In contrast, deploying disease resistance genes in improved cereal varieties has proven the most economical and environmentally sustainable approach to protect yield potential and ensure adequate quantities of pesticide-free food [5]. Plant genotypes or populations exposed to pathogenic microbes often vary in their phenotypic response or degree of infestation, owing to differences in inherited defenses. Plant breeders and geneticists constantly assess visual symptoms of disease resistance traits in their experimental material. Although these data are valuable, their reproducibility is limited by individual scoring biases, non-quantitative inoculation methods, and environmental variables. Currently, there is an imbalance between our ability to manipulate plant genomes and our capacity to phenotype disease accurately, creating a bottleneck in the plant breeding pipeline.

Despite next-generation sequencing reducing genotyping costs more than 100-fold in the last 20 years, the cost and inaccuracy of disease phenotyping have impeded genetic gain. Furthermore, the requirement of phenotypic accuracy for trait dissection and breeding has led to a historic selection bias towards race-specific resistance genes of major phenotypic effect, which are easier to phenotype but rarely durable in agricultural settings when deployed singly, due to rapidly evolving pathogen populations [6] (Fig. 1A to D). In contrast, partial resistance (PR) refers to reduced or delayed growth of the pathogen and is typically conditioned by several quantitatively inherited alleles (Fig. 1E). This form of resistance is durable, non-race-specific, and incomplete when the effect of a single locus is considered in isolation [7]. However, the cumulative effect of several minor partial resistance loci often leads to complete immunity and has been reported to account for reduced disease epidemics. Both gene isolation and subsequent functional characterization studies in cereal crop plants demonstrate that PR genes show higher mechanistic diversification than R genes, explaining their durability [8–10]. For convenience, the term adult plant resistance (APR) is commonly used to describe PR observed i

Plant Phenomics, vol. 5, 0023. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10076052/pdf/
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0076
Tobias Selzner, Jannis Horn, Magdalena Landl, Andreas Pohlmeier, Dirk Helmrich, Katrin Huber, Jan Vanderborght, Harry Vereecken, Sven Behnke, Andrea Schnepf
3D U-Net Segmentation Improves Root System Reconstruction from 3D MRI Images in Automated and Manual Virtual Reality Work Flows.

Magnetic resonance imaging (MRI) is used to image root systems grown in opaque soil. However, reconstruction of root system architecture (RSA) from 3-dimensional (3D) MRI images is challenging. Low resolution and poor contrast-to-noise ratios (CNRs) hinder automated reconstruction. Hence, manual reconstruction is still widely used. Here, we evaluate a novel 2-step work flow for automated RSA reconstruction. In the first step, a 3D U-Net segments MRI images into root and soil in super-resolution. In the second step, an automated tracing algorithm reconstructs the root systems from the segmented images. We evaluated the merits of both steps on an MRI dataset of 8 lupine root systems by comparing the automated reconstructions to manual reconstructions of unaltered and segmented MRI images derived with a novel virtual reality system. We found that the U-Net segmentation offers profound benefits in manual reconstruction: reconstruction speed was doubled (+97%) for images with low CNR and increased by 27% for images with high CNR. Reconstructed root lengths were increased by 20% and 3%, respectively. Therefore, we propose using U-Net segmentation as a principal image preprocessing step in manual work flows. The root length derived by the tracing algorithm was lower than in both manual reconstruction methods, but segmentation allowed automated processing of otherwise not readily usable MRI images. Nonetheless, model-based functional root traits revealed similar hydraulic behavior of automated and manual reconstructions. Future studies will aim to establish a hybrid work flow that utilizes automated reconstructions as scaffolds that can be manually corrected.

Plant Phenomics, vol. 5, 0076. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10381537/pdf/
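The contrast-to-noise ratio (CNR) that limits automated reconstruction in the abstract above can be made concrete with a toy computation. This is a minimal sketch under one common CNR definition, (mean root intensity − mean soil intensity) / soil noise standard deviation; the voxel values and the exact definition the authors used are assumptions for illustration.

```python
from statistics import mean, stdev

def cnr(root_voxels, soil_voxels):
    # One common CNR definition: intensity separation between the two
    # classes, scaled by the background (soil) noise level.
    return (mean(root_voxels) - mean(soil_voxels)) / stdev(soil_voxels)

root = [180, 175, 190, 185, 170]  # bright root voxel intensities (toy values)
soil = [40, 55, 45, 50, 60]       # darker, noisy soil background (toy values)
print(round(cnr(root, soil), 2))  # → 16.44
```

When CNR drops toward 1, the two intensity distributions overlap, which is why plain thresholding fails and a learned 3D U-Net segmentation helps.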
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0056
Kaili Yang, Jiacai Mo, Shanjun Luo, Yi Peng, Shenghui Fang, Xianting Wu, Renshan Zhu, Yuanjin Li, Ningge Yuan, Cong Zhou, Yan Gong
Estimation of Rice Aboveground Biomass by UAV Imagery with Photosynthetic Accumulation Models.

Effective and accurate aboveground biomass (AGB) estimation facilitates evaluating crop growth and site-specific crop management. Considering that rice accumulates AGB mainly through green leaf photosynthesis, we proposed the photosynthetic accumulation model (PAM) and a simplified version, and compared them for estimating AGB. These methods estimate the AGB of various rice cultivars throughout the growing season by integrating vegetation index (VI) and canopy height based on images acquired by unmanned aerial vehicles (UAVs). The results indicated that the correlation between VI and AGB was weak over the whole growing season of rice, and the accuracy of the height model was likewise limited for the whole season. Compared with the NDVI-based rice AGB estimation model on 2019 data (R² = 0.03, RMSE = 603.33 g/m²) and canopy height (R² = 0.79, RMSE = 283.33 g/m²), the PAM calculated from NDVI and canopy height provided a better estimate of rice AGB (R² = 0.95, RMSE = 136.81 g/m²). Then, based on time-series analysis of the accumulative model, a simplified photosynthetic accumulation model (SPAM) was proposed that needs only limited observations to achieve R² above 0.8. The PAM and SPAM models built using 2 years of samples successfully predicted the third year of samples, demonstrating the robustness and generalization ability of the models. In conclusion, these methods can be easily and efficiently applied to UAV estimation of rice AGB over the entire growing season, with great potential to serve large-scale field management and breeding.

Plant Phenomics, vol. 5, 0056. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10238111/pdf/
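The accumulation idea behind PAM can be sketched in a few lines. This is a simplified illustration only, assuming AGB is regressed against the season-long integral of NDVI × canopy height; the function name, weekly sampling interval, and toy values are ours, not the paper's.

```python
def pam_feature(ndvi_series, height_series, dt_days=7):
    # Riemann-sum "photosynthetic accumulation": integrate the product of
    # vegetation index and canopy height over the sampled season.
    assert len(ndvi_series) == len(height_series)
    return sum(v * h * dt_days for v, h in zip(ndvi_series, height_series))

# Toy weekly UAV observations for one plot (hypothetical values).
ndvi = [0.3, 0.5, 0.7, 0.8, 0.75]
height = [0.2, 0.4, 0.6, 0.8, 0.9]  # canopy height in metres
feature = pam_feature(ndvi, height)
# AGB would then be estimated by a simple regression, AGB ≈ a * feature + b,
# calibrated against destructively sampled plots.
```

The accumulated feature keeps growing late in the season even as NDVI saturates, which is consistent with why it tracks AGB better than NDVI alone.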
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0068
Dominik Rößle, Lukas Prey, Ludwig Ramgraber, Anja Hanemann, Daniel Cremers, Patrick Ole Noack, Torsten Schön
Efficient Noninvasive FHB Estimation using RGB Images from a Novel Multiyear, Multirater Dataset.

Fusarium head blight (FHB) is one of the most prevalent wheat diseases, causing substantial yield losses and health risks. Efficient phenotyping of FHB is crucial for accelerating resistance breeding, but currently used methods are time-consuming and expensive. The present article suggests a noninvasive classification model for FHB severity estimation using red-green-blue (RGB) images, without requiring extensive preprocessing. The model accepts images taken from consumer-grade, low-cost RGB cameras and classifies the FHB severity into 6 ordinal levels. In addition, we introduce a novel dataset consisting of around 3,000 images from 3 different years (2020, 2021, and 2022) and 2 FHB severity assessments per image from independent raters. We used a pretrained EfficientNet (size b0), redesigned as a regression model. The results demonstrate that the interrater reliability (Cohen's kappa, κ) is substantially lower than the achieved individual network-to-rater results, e.g., 0.68 and 0.76 for the data captured in 2020, respectively. The model shows a generalization effect when trained with data from multiple years and tested on data from an independent year. Thus, using the images from 2020 and 2021 for training and 2022 for testing, we improved the weighted F1 score by 0.14, the accuracy by 0.11, κ by 0.12, and reduced the root mean squared error by 0.5 compared to the best network trained only on a single year's data. The proposed lightweight model and methods could be deployed on mobile devices to automatically and objectively assess FHB severity with images from low-cost RGB cameras. The source code and the dataset are available at https://github.com/cvims/FHB_classification.

Plant Phenomics, vol. 5, 0068. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10348660/pdf/
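The interrater reliability reported above is Cohen's kappa, which can be computed directly from two raters' labels. A minimal sketch with made-up severity scores on the 6-level scale (0 to 5); we use the unweighted kappa, which is an assumption, since the abstract does not state a weighting scheme.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    # Unweighted Cohen's kappa: observed agreement corrected for the
    # agreement expected by chance from each rater's label frequencies.
    n = len(rater_a)
    p_obs = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_exp = sum(ca[k] * cb[k] for k in ca) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

# Two hypothetical raters scoring 10 plots on the 6-level FHB scale.
a = [0, 1, 1, 2, 3, 3, 4, 5, 2, 1]
b = [0, 1, 2, 2, 3, 4, 4, 5, 2, 0]
print(round(cohens_kappa(a, b), 2))  # → 0.64
```

Values in the 0.6 to 0.8 range are conventionally read as "substantial" agreement, which puts the reported human κ of 0.68 in context against the network-to-rater 0.76.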
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0013
Chenglong Huang, Zhongfu Zhang, Xiaojun Zhang, Li Jiang, Xiangdong Hua, Junli Ye, Wanneng Yang, Peng Song, Longfu Zhu
A Novel Intelligent System for Dynamic Observation of Cotton Verticillium Wilt.

Verticillium wilt is one of the most critical cotton diseases and is widely distributed across cotton-producing countries. However, the conventional method of investigating verticillium wilt is still manual, with the disadvantages of subjectivity and low efficiency. In this research, an intelligent vision-based system was proposed to dynamically observe cotton verticillium wilt with high accuracy and high throughput. First, a 3-coordinate motion platform was designed with a movement range of 6,100 mm × 950 mm × 500 mm, and a dedicated control unit was adopted to achieve accurate movement and automatic imaging. Second, verticillium wilt recognition was established based on 6 deep learning models, among which the VarifocalNet (VFNet) model had the best performance, with a mean average precision (mAP) of 0.932. Meanwhile, deformable convolution, deformable region-of-interest pooling, and soft non-maximum suppression optimizations were adopted to improve VFNet, raising the mAP of the VFNet-Improved model by 1.8%. The precision-recall curves showed that VFNet-Improved was superior to VFNet for each category, with a larger improvement on the ill leaf category than on the fine leaf category. The regression results showed that system measurements based on VFNet-Improved achieved high consistency with manual measurements. Finally, user software was designed based on VFNet-Improved, and the dynamic observation results proved that this system is able to accurately investigate cotton verticillium wilt and quantify the prevalence rate of different resistant varieties. In conclusion, this study has demonstrated a novel intelligent system for the dynamic observation of cotton verticillium wilt on the seedbed, providing a feasible and effective tool for cotton breeding and disease resistance research.

Plant Phenomics, vol. 5, 0013. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10076053/pdf/
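Detection metrics such as the mAP reported for VFNet are built on the intersection-over-union (IoU) between predicted and ground-truth leaf boxes. A minimal sketch of that building block; the box format and values are illustrative.

```python
def iou(box_a, box_b):
    # Boxes are (x1, y1, x2, y2) with x1 < x2 and y1 < y2.
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    return inter / (area_a + area_b - inter)

# A prediction overlapping a ground-truth leaf box by a 5 x 5 corner.
print(round(iou((0, 0, 10, 10), (5, 5, 15, 15)), 3))  # → 0.143
```

A detection typically counts as a true positive when IoU exceeds a threshold (0.5 is a common default); soft non-maximum suppression, one of the VFNet-Improved ingredients, decays the scores of heavily overlapping boxes instead of deleting them outright.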
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0040
Pengyao Xie, Ruiming Du, Zhihong Ma, Haiyan Cen
Generating 3D Multispectral Point Clouds of Plants with Fusion of Snapshot Spectral and RGB-D Images.

Accurate and high-throughput plant phenotyping is important for accelerating crop breeding. Spectral imaging, which acquires both spectral and spatial information related to structural, biochemical, and physiological plant traits, has become one of the most popular phenotyping techniques. However, close-range spectral imaging of plants is strongly affected by complex plant structure and illumination conditions, which is one of the main challenges for close-range plant phenotyping. In this study, we proposed a new method for generating high-quality plant 3-dimensional multispectral point clouds. Speeded-Up Robust Features and Demons was used to fuse depth and snapshot spectral images acquired at close range. A reflectance correction method for plant spectral images, based on hemisphere references combined with an artificial neural network, was developed to eliminate illumination effects. The proposed Speeded-Up Robust Features and Demons approach achieved an average structural similarity index measure of 0.931 in RGB and snapshot spectral image registration, outperforming classic approaches with an average structural similarity index measure of 0.889. The distribution of digital number values of the references at different positions and orientations was simulated using the artificial neural network, with a coefficient of determination (R²) of 0.962 and a root mean squared error of 0.036. Compared with ground truth measured by an ASD spectrometer, reflectance correction reduced the average root mean squared error of the reflectance spectra at different leaf positions by 78.0%. For the same leaf position, the average Euclidean distances between the multiview reflectance spectra decreased by 60.7%. Our results indicate that the proposed method performs well in generating plant 3-dimensional multispectral point clouds and is promising for close-range plant phenotyping.

Plant Phenomics, vol. 5, 0040. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10069917/pdf/
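The reference-based reflectance correction can be illustrated in its simplest form. The paper trains an ANN over hemisphere references at many positions and orientations; the sketch below shows only the underlying single-reference (empirical-line style) idea, with hypothetical digital-number values and panel reflectance.

```python
def correct_reflectance(dn_plant, dn_reference, rho_reference=0.5):
    # Convert raw digital numbers (DN) to reflectance by ratioing against
    # a reference of known reflectance imaged under the same illumination.
    return [dn / dn_reference * rho_reference for dn in dn_plant]

# Hypothetical DN values for two leaf pixels and a 50%-reflectance panel.
print(correct_reflectance([0.2, 0.4], dn_reference=0.8))  # → [0.125, 0.25]
```

The ANN generalizes this by predicting the reference DN at each surface position and orientation, so leaves that are tilted away from the light are corrected with an appropriately dimmer reference value.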
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0086
Haozhou Wang, Tang Li, Erika Nishida, Yoichiro Kato, Yuya Fukano, Wei Guo

Drone-Based Harvest Data Prediction Can Reduce On-Farm Food Loss and Improve Farmer Income.

On-farm food loss (i.e., grade-out vegetables) is a difficult challenge in sustainable agricultural systems. The simplest way to reduce grade-out vegetables is to monitor and predict the size of every individual in the vegetable field and determine the optimal harvest date with the smallest grade-out count and the highest profit, which is not cost-effective with conventional methods. Here, we developed a full pipeline to accurately estimate and predict every broccoli head size (n > 3,000) automatically and nondestructively using drone remote sensing and image analysis. The individual sizes were fed to a temperature-based growth model to predict the optimal harvesting date. Two years of field experiments revealed that our pipeline successfully estimated and predicted the head size of all broccolis with high accuracy. We also found that a deviation of only 1 to 2 days from the optimal date can considerably increase grade-out and reduce farmers' profits. This is an unequivocal demonstration of the utility of these approaches for economic crop optimization and the minimization of food losses.

Plant Phenomics, vol. 5, 0086. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10484300/pdf/
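Temperature-based growth models of the kind used to pick the harvest date typically rest on accumulated growing degree days (GDD). A minimal sketch of that logic; the base temperature, GDD target, and forecast values are hypothetical, not taken from the paper.

```python
def days_to_harvest(forecast_temps, gdd_so_far, gdd_target, base_temp=4.5):
    # Accumulate growing degree days (daily mean above a base temperature)
    # and return the first forecast day on which the target is reached.
    total = gdd_so_far
    for day, temp in enumerate(forecast_temps, start=1):
        total += max(0.0, temp - base_temp)
        if total >= gdd_target:
            return day
    return None  # target not reached within the forecast horizon

# A head at 500 GDD, a (hypothetical) 520-GDD harvest target,
# and a 3-day forecast of daily mean temperatures in deg C.
print(days_to_harvest([12.0, 14.0, 13.0], gdd_so_far=500.0, gdd_target=520.0))  # → 3
```

Running a per-head prediction like this over all heads in the field is what lets the pipeline trade off heads still undersized against heads about to grow past the top grade.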
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0066
Qiushi Yu, Jingqi Wang, Hui Tang, Jiaxi Zhang, Wenjie Zhang, Liantao Liu, Nan Wang
Application of Improved UNet and EnlightenGAN for Segmentation and Reconstruction of In Situ Roots.

The root is an important organ for crops to absorb water and nutrients. Complete and accurate acquisition of root phenotype information is important in root phenomics research. In situ root methods can obtain root images without destroying the roots. In such images, however, parts of the root system are shaded by soil, which severely fractures the imaged root system and diminishes its structural integrity. Methods for ensuring the integrity of in situ root identification and for restoring in situ root image phenotypes remain to be explored. Therefore, based on in situ images of cotton roots, this study proposes a root segmentation and reconstruction strategy: it improves the UNet model to achieve precise segmentation, adjusts the weight parameters of EnlightenGAN to achieve complete reconstruction, and employs transfer learning to implement enhanced segmentation using the results of the former two. The results show that the improved UNet model has an accuracy of 99.2%, an mIOU of 87.03%, and an F1 of 92.63%. Roots reconstructed by EnlightenGAN after direct segmentation have an effective reconstruction ratio of 92.46%. By designing a combined segmentation-and-reconstruction network strategy, this study enables a transition from supervised to unsupervised training of root system reconstruction. It restores the integrity of in situ root images and provides a new approach for studying in situ root phenotypes.

Plant Phenomics, vol. 5, 0066. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10325669/pdf/
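The accuracy, mIOU, and F1 figures above all derive from the pixel-level confusion matrix of the segmentation. A minimal sketch for the binary root/background case; the pixel counts are made up.

```python
def seg_scores(tp, fp, fn, tn):
    # Pixel accuracy, mean IoU over the two classes, and foreground F1,
    # all computed from one binary confusion matrix.
    acc = (tp + tn) / (tp + fp + fn + tn)
    miou = (tp / (tp + fp + fn) + tn / (tn + fp + fn)) / 2
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    f1 = 2 * precision * recall / (precision + recall)
    return acc, miou, f1

# Toy counts for a 1,000-pixel image: roots are thin, background dominates.
acc, miou, f1 = seg_scores(tp=80, fp=10, fn=10, tn=900)
print(round(acc, 3), round(miou, 3), round(f1, 3))  # → 0.98 0.889 0.889
```

Note how accuracy is inflated by the dominant background class, which is why mIOU and F1 are the more informative numbers for thin structures like roots, and why the reported 99.2% accuracy sits well above the 87.03% mIOU.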
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0031
Chaeyeong Yun, Yu Hwan Kim, Sung Jae Lee, Su Jin Im, Kang Ryoung Park
WRA-Net: Wide Receptive Field Attention Network for Motion Deblurring in Crop and Weed Image.

Accurately and automatically segmenting crops and weeds in camera images is essential in various agricultural technology fields, such as herbicide spraying by farming robots based on crop and weed segmentation information. However, crop and weed images taken with a camera exhibit motion blur from various causes (e.g., vibration or shaking of a camera on a farming robot, or shaking of the crops and weeds themselves), which reduces segmentation accuracy. Robust crop and weed segmentation for motion-blurred images is therefore essential, yet previous segmentation studies did not consider motion-blurred images. To solve this problem, this study proposed a new motion-blur image restoration method based on a wide receptive field attention network (WRA-Net), and investigated how it improves crop and weed segmentation accuracy in motion-blurred images. WRA-Net is built around a main block, the lite wide receptive field attention residual block, which comprises modified depthwise separable convolutional blocks, an attention gate, and a learnable skip connection. We conducted experiments on 3 open databases: the BoniRob, crop/weed field image, and rice seedling and weed datasets. The crop and weed segmentation accuracy, measured as mean intersection over union, was 0.7444, 0.7741, and 0.7149, respectively, demonstrating that this method outperforms state-of-the-art methods.

Plant Phenomics, vol. 5, 0031. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10243196/pdf/
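Motion blur of the kind WRA-Net is trained to undo can be simulated as a moving average along the motion direction. A 1-D toy sketch; the kernel length and edge handling are illustrative choices, not the paper's blur model.

```python
def motion_blur_1d(signal, kernel_len=3):
    # Box-filter the signal, clamping indices at the edges: a toy model of
    # linear camera-shake blur along one image axis.
    n = len(signal)
    half = kernel_len // 2
    blurred = []
    for i in range(n):
        window = [signal[min(max(j, 0), n - 1)] for j in range(i - half, i + half + 1)]
        blurred.append(sum(window) / kernel_len)
    return blurred

# A single bright pixel smears into its neighbours.
print(motion_blur_1d([0, 0, 3, 0, 0]))  # → [0.0, 1.0, 1.0, 1.0, 0.0]
```

Deblurring is the ill-posed inverse of this operation, which is why a learned restoration network is placed in front of the segmentation model in the proposed pipeline.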
Plant Phenomics · Pub Date: 2023-01-01 · DOI: 10.34133/plantphenomics.0024
Jie Xu, Jia Yao, Hang Zhai, Qimeng Li, Qi Xu, Ying Xiang, Yaxi Liu, Tianhong Liu, Huili Ma, Yan Mao, Fengkai Wu, Qingjun Wang, Xuanjun Feng, Jiong Mu, Yanli Lu
TrichomeYOLO: A Neural Network for Automatic Maize Trichome Counting.

Plant trichomes are epidermal structures with a wide variety of functions in plant development and stress responses. Although the functional importance of trichomes is well recognized, the tedious and time-consuming manual phenotyping process greatly limits progress in trichome gene cloning. Currently, there are no fully automated methods for identifying maize trichomes. We introduce TrichomeYOLO, an automated trichome counting and measuring method based on a deep convolutional neural network, which identifies the density and length of maize trichomes from scanning electron microscopy images. Our network achieved 92.1% identification accuracy on scanning electron microscopy micrographs of maize leaves, outperforming 5 mainstream object detection models: Faster R-CNN, YOLOv3, YOLOv5, DETR, and Cascade R-CNN. We applied TrichomeYOLO to investigate trichome variation in a natural maize population and achieved robust trichome identification. Our method and the pretrained model are openly available on GitHub (https://github.com/yaober/trichomecounter). We believe TrichomeYOLO will enable efficient trichome identification and facilitate research on maize trichomes.

Plant Phenomics, vol. 5, 0024. Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC10013788/pdf/
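Turning per-image trichome counts into a density requires only the micrograph's pixel scale. A minimal sketch; the image size and micrometre-per-pixel scale are hypothetical values, not from the paper.

```python
def trichome_density(count, width_px, height_px, um_per_px):
    # Trichomes per square millimetre: convert the imaged field of view
    # from pixels to mm using the SEM scale, then divide the count by it.
    area_mm2 = (width_px * um_per_px / 1000.0) * (height_px * um_per_px / 1000.0)
    return count / area_mm2

# 24 detected trichomes in a 1,024 x 1,024 px micrograph at 2 um/px.
print(round(trichome_density(24, 1024, 1024, 2.0), 2))  # → 5.72
```

Length measurements work the same way: multiply a detection's pixel extent by the per-pixel scale, so both traits depend on recording the SEM magnification with each micrograph.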