Data driven discovery and quantification of hyperspectral leaf reflectance phenotypes across a maize diversity panel
Michael C. Tross, Marcin W. Grzybowski, T. Jubery, Ryleigh J. Grove, Aime Nishimwe, J. V. Torres-Rodríguez, Guangchao Sun, B. Ganapathysubramanian, Yufeng Ge, James C. Schnable
The Plant Phenome Journal | DOI: 10.1002/ppj2.20106 | Published 2024-06-06
Abstract: Estimates of plant traits derived from hyperspectral reflectance data have the potential to efficiently substitute for traits that are time- or labor-intensive to score manually. Typical workflows for estimating plant traits from hyperspectral reflectance data employ supervised classification models that can require substantial ground truth datasets for training. We explore the potential of an unsupervised approach, autoencoders, to extract meaningful traits from plant hyperspectral reflectance data using measurements of the reflectance of 2151 individual wavelengths of light from the leaves of maize (Zea mays) plants harvested from 1658 field plots in a replicated field trial. A subset of autoencoder-derived variables exhibited significant repeatability, indicating that a substantial proportion of the total variance in these variables was explained by differences between maize genotypes, while other autoencoder variables appear to capture variation resulting from changes in leaf reflectance between different batches of data collection. Several of the repeatable latent variables were significantly correlated with other traits scored from the same maize field experiment, including one autoencoder-derived latent variable (LV8) that predicted plant chlorophyll content modestly better than a supervised model trained on the same data. In at least one case, genome-wide association study hits for variation in autoencoder-derived variables were proximal to genes with known or plausible links to leaf phenotypes expected to alter hyperspectral reflectance. In aggregate, these results suggest that an unsupervised, autoencoder-based approach can identify meaningful and genetically controlled variation in high-dimensional, high-throughput phenotyping data and link identified variables back to known plant traits of interest.
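The repeatability statistic this abstract relies on (the share of total variance explained by genotype) can be sketched as a simple one-way variance decomposition. This is an illustrative balanced-design calculation, not the authors' actual pipeline, and the function and data names are hypothetical:

```python
def repeatability(values_by_genotype):
    """Estimate repeatability of a trait or latent variable: the fraction
    of total variance attributable to differences among genotypes.

    values_by_genotype: dict mapping genotype name -> list of replicate values.
    Uses a simple ANOVA-style decomposition assuming a balanced design.
    """
    # Grand mean over all observations
    all_vals = [v for reps in values_by_genotype.values() for v in reps]
    grand = sum(all_vals) / len(all_vals)

    # Between-genotype variance: variance of the genotype means
    means = {g: sum(r) / len(r) for g, r in values_by_genotype.items()}
    var_between = sum((m - grand) ** 2 for m in means.values()) / len(means)

    # Within-genotype (residual) variance: average squared deviation of
    # replicates from their own genotype mean
    ss_within = sum((v - means[g]) ** 2
                    for g, reps in values_by_genotype.items() for v in reps)
    var_within = ss_within / len(all_vals)

    return var_between / (var_between + var_within)
```

A variable whose replicates cluster tightly within genotypes but differ between genotypes scores near 1; one dominated by batch noise scores near 0.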
Estimating Fusarium head blight severity in winter wheat using deep learning and a spectral index
Riley McConachie, Connor Belot, Mitra Serajazari, Helen Booker, John Sulik
The Plant Phenome Journal | DOI: 10.1002/ppj2.20103 | Published 2024-05-22
Abstract: Fusarium head blight (FHB) of wheat (Triticum aestivum L.), caused by the fungal pathogen Fusarium graminearum (Fg), reduces grain yield and quality due to the production of the mycotoxin deoxynivalenol. Manual rating for incidence (percent of infected wheat heads/spikes) and severity (percent of spikelets infected) to estimate FHB resistance is time-consuming and subject to human error. This study uses a deep learning model, combined with a spectral index, to provide rapid phenotyping of FHB severity. An object detection model was used to localize wheat heads within bounding boxes. The corresponding boxes were used to prompt Meta's Segment Anything Model to segment wheat heads. Using 2576 images of wheat heads point-inoculated with Fg in a controlled environment, a spectral index was developed using the red and green bands to differentiate healthy from infected tissue and estimate disease severity. Stratified random sampling was applied to pixels within the segmentation mask, and the model classified pixels as healthy or infected with an accuracy of 87.8%. Linear regression determined the relationship between the index and visual severity scores. The severity estimated by the index predicted visual scores (R2 = 0.83, p < 2e-16). This workflow was also applied to plot-scale images of infected wheat heads from an outside dataset with varying cultivars and lighting to assess model transferability. It correctly classified pixels as healthy or infected with a prediction accuracy of 85.8%. These methods may provide rapid estimation of FHB severity to improve selection efficiency for resistance or estimate disease pressure for effective management.
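The abstract does not give the exact form of the red/green index, so the sketch below assumes a normalized green-red difference and an illustrative threshold of zero; the index formula, threshold, and function names are assumptions, not the paper's:

```python
def green_red_index(green, red):
    """Normalized difference of green and red reflectance for one pixel.
    Healthy green tissue gives positive values; bleached, infected
    spikelets shift toward red, lowering the index."""
    return (green - red) / (green + red)

def classify_pixels(pixels, threshold=0.0):
    """Label each (green, red) pixel as healthy or infected by thresholding
    the index. The 0.0 threshold is illustrative only."""
    return ["healthy" if green_red_index(g, r) > threshold else "infected"
            for g, r in pixels]

def severity(labels):
    """Disease severity: fraction of sampled head pixels labeled infected."""
    return labels.count("infected") / len(labels)
```

In the paper's workflow, pixels like these would come from stratified random sampling inside each Segment Anything head mask, and the resulting severity would be regressed against visual scores.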
Zero-shot insect detection via weak language supervision
Ben Feuer, Ameya Joshi, Minsu Cho, Kewal Jani, Shivani Chiranjeevi, Ziwei Deng, Aditya Balu, Ashutosh Kumar Singh, S. Sarkar, Nirav C. Merchant, Arti Singh, B. Ganapathysubramanian, C. Hegde
The Plant Phenome Journal | DOI: 10.1002/ppj2.20107 | Published 2024-05-21
Abstract: Cheap and ubiquitous sensing has made collecting large agricultural datasets relatively straightforward. These large datasets (for instance, those on citizen science curation platforms like iNaturalist) can pave the way for developing powerful artificial intelligence (AI) models for detection and counting. However, traditional supervised learning methods require labeled data, and manual annotation of these raw datasets with useful labels (such as bounding boxes or segmentation masks) can be extremely laborious, expensive, and error-prone. In this paper, we demonstrate the power of zero-shot computer vision methods, a new family of approaches that require (almost) no manual supervision, for plant phenomics applications. Focusing on insect detection as the primary use case, we show that our models enable highly accurate detection of insects in a variety of challenging imaging environments. Our technical contributions are twofold: (a) we curate the Insecta rank class of iNaturalist to form a new benchmark dataset of approximately 6 million images covering 2526 agriculturally and ecologically important species, including pests and beneficial insects; and (b) using a vision-language object detection method coupled with weak language supervision, we automatically annotate images in this dataset with bounding box information localizing the insect within each image. Our method succeeds in detecting diverse insect species present in a wide variety of backgrounds, producing high-quality bounding boxes in a zero-shot manner with no additional training cost. This open dataset can serve as a use-inspired benchmark for the AI community. We demonstrate that our method can also be used for other applications in plant phenomics, such as fruit detection in images of strawberry and apple trees. Overall, our framework highlights the promise of zero-shot approaches to make high-throughput plant phenotyping more affordable.
Erratum to: Estimation of the nutritive value of grasslands with the Yara N-sensor field spectrometer
The Plant Phenome Journal | DOI: 10.1002/ppj2.20091 | Published 2024-03-16
The Height Pole: Measuring plot height using a single-point LiDAR sensor
Malcolm J. Morrison, A. Gahagan, T. Hotte, Hannah E. Morrison, Matthew Kenny, A. Saumure, Marc B. Lefevbre
The Plant Phenome Journal | DOI: 10.1002/ppj2.20097 | Published 2024-03-08
Abstract: Plant canopy height is an essential trait for phenomics and plant breeding. Despite its importance, height is still largely measured manually with a ruler and notepad. Here, we present the Height Pole, a novel single-point LiDAR (SPL)-based instrument to measure and record plant and canopy height in the field quickly, reliably, and accurately. An SPL was mounted on top of a pole and aimed downward at an adjustable paddle positioned at the desired height. A custom Android app saved the plant height data from the SPL to a tablet. The Height Pole was tested against a ruler in the lab, in a field trial setting, and by multiple operators. Indoor and outdoor testing found no significant differences between ruler and Height Pole measurements. A test with five operators revealed that measuring, recording, transcribing, and digitizing took on average 20 s more per plot with a ruler than with the Height Pole. The Height Pole required only one operator to measure and record data, reduced operator fatigue, and, by writing data directly to a .csv file, eliminated transcription errors. These improvements make it easier to collect crop height data on large experiments rapidly and accurately with low input costs.
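The geometry behind the instrument is a single subtraction: the downward-aimed sensor's mount height minus the measured distance to the paddle gives canopy height. A minimal sketch, with parameter names and units as assumptions (the actual app's internals are not described in the abstract):

```python
def plant_height(mount_height_cm, lidar_distance_cm):
    """Canopy height from one single-point LiDAR reading.

    The sensor sits at the top of the pole aiming down; the paddle rests
    on the canopy, so height above ground is the mount height minus the
    measured sensor-to-paddle distance.
    """
    return mount_height_cm - lidar_distance_cm

def plot_heights(mount_height_cm, distances_cm):
    """Convert a sequence of per-plot distance readings to heights,
    ready to be written out as rows of a .csv file."""
    return [mount_height_cm - d for d in distances_cm]
```

For example, a sensor mounted at 250 cm reading a 100 cm distance to the paddle implies a 150 cm canopy.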
Adoption of unoccupied aerial systems in agricultural research
Jennifer Lachowiec, Max J. Feldman, Filipe Inacio Matias, D. LeBauer, Alexander Gregory
The Plant Phenome Journal | DOI: 10.1002/ppj2.20098 | Published 2024-03-08
Abstract: A comprehensive survey and subject-expert interviews among agricultural researchers investigated the perceived value of, and barriers to, the adoption of unoccupied aerial systems (UASs) in agricultural research. These systems, colloquially called drones, consist of unoccupied/uncrewed/unmanned vehicles and their onboard sensors. The study involved 154 respondents from 21 countries representing various agricultural sectors. The survey identified three key applications considered most promising for UASs in agriculture: precision agriculture, crop phenotyping/plant breeding, and crop modeling. Over 80% of respondents rated UASs for phenotyping as valuable, with 47.6% considering them very valuable. Among the participants, 41% were already using UAS technology in their research, while 49% expressed interest in future adoption. Current users rated UASs for phenotyping especially highly, with 63.9% considering them very valuable, compared to 39.4% of potential future users. The study also explored barriers to UAS adoption. The most commonly reported barriers were the "high cost of instruments/devices or software" (46.0%) and the "lack of knowledge or trained personnel to analyze data" (40.9%); these persisted as top concerns for both current and potential future users. Respondents identified detailed step-by-step protocols for drone data processing pipelines (34.7%) and in-person training for personnel (16.5%) as valuable resources for UAS adoption. The research sheds light on the prevailing perceptions and challenges associated with UAS usage in agricultural research, emphasizing the potential of UASs in specific applications and identifying crucial barriers to address for wider adoption in the agricultural sector.
UAV image acquisition and processing for high-throughput phenotyping in agricultural research and breeding programs
Ocident Bongomin, Jimmy Lamo, Joshua Mugeziaubwa Guina, Collins Okello, Gilbert Gilibrays Ocen, Morish Obura, Simon Alibu, Cynthia Awuor Owino, A. Akwero, Samson Ojok
The Plant Phenome Journal | DOI: 10.1002/ppj2.20096 | Published 2024-02-19
Abstract: We are in a race against time to combat climate change and to increase food production by 70% to feed an ever-growing world population. Agricultural research plays a vital role in improving crops and livestock through breeding programs and good agricultural practices, enabling sustainable agriculture and food systems. While advanced molecular breeding technologies have been widely adopted, phenotyping, an essential aspect of agricultural research and breeding programs, has seen little development in most African institutions and remains largely manual. However, the concept of high-throughput phenotyping (HTP) has been gaining momentum, particularly unmanned aerial vehicle (UAV)-based phenotyping. Although research into UAV-based phenotyping is still limited, this paper provides a comprehensive overview of the use of UAV platforms and image analytics for HTP in agricultural research and identifies the key challenges and opportunities in this area. The paper discusses field phenotyping concepts, UAV classification and specifications, use cases of UAV-based phenotyping, UAV imaging systems for phenotyping, and image processing and analytics methods. More research is required to optimize UAV performance for image data acquisition, as few studies have examined the effect of UAV operational parameters on data acquisition.
Allometry and volumes in a nutshell: Analyzing walnut morphology using three-dimensional X-ray computed tomography
Erik J. Amézquita, Michelle Y. Quigley, Patrick J. Brown, Elizabeth Munch, D. Chitwood
The Plant Phenome Journal | DOI: 10.1002/ppj2.20095 | Published 2024-02-19
Abstract: Persian walnuts (Juglans regia L.) are the second most produced and consumed tree nut, with over 2.6 million metric tons produced in the 2022-2023 harvest cycle alone. The United States is the second largest producer, accounting for 25% of the total global supply. Nonetheless, producers face ever-growing demand in a more uncertain climate landscape, which requires effective and efficient walnut selection and breeding of new cultivars with increased kernel content and easy-to-open shells. Past and current efforts select for these traits using hand-held calipers and eye-based evaluations, yet there is plenty of morphology that meets the eye but goes unmeasured, such as the volume of inner air or the convexity of the kernel. Here, we study the shape of walnut fruits based on three-dimensional X-ray computed tomography reconstructions. We compute 49 different morphological phenotypes for 1264 individual nuts comprising 149 accessions. These phenotypes are complemented by traits of breeding interest such as ease of kernel removal and kernel-to-nut weight ratio. Through allometric relationships, the relative growth of one tissue with respect to another, we identify possible biophysical constraints at play during development. We explore correlations between all morphological and commercial traits and identify which morphological traits explain the most variability in commercial traits. We show that using only volume- and thickness-based traits, especially inner air content, we can successfully encode several of the commercial traits.
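Allometric relationships of the kind this abstract describes are conventionally fit as a power law, y = a * x**b, via linear regression on log-transformed data. A minimal sketch under that assumption (the authors' exact fitting procedure is not given here):

```python
import math

def fit_allometry(x_vals, y_vals):
    """Least-squares fit of the allometric model y = a * x**b, performed
    as a linear regression on logs: log y = log a + b * log x.
    Returns (a, b); b is the allometric exponent relating the relative
    growth of one tissue to another."""
    lx = [math.log(x) for x in x_vals]
    ly = [math.log(y) for y in y_vals]
    n = len(lx)
    mx = sum(lx) / n
    my = sum(ly) / n
    # Ordinary least-squares slope and intercept in log space
    b = (sum((u - mx) * (v - my) for u, v in zip(lx, ly))
         / sum((u - mx) ** 2 for u in lx))
    a = math.exp(my - b * mx)
    return a, b
```

An exponent b near 1 indicates isometric growth (tissues scaling proportionally); b far from 1 hints at the kind of biophysical constraint the authors look for.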
Erratum to: Mixing things up! Identifying early diversity benefits and facilitating the development of improved variety mixtures with high throughput field phenotyping
Flavian Tschurr, Corina Oppliger, Samuel E. Wuest, N. Kirchgessner, Achim Walter
The Plant Phenome Journal | DOI: 10.1002/ppj2.20093 | Published 2024-01-30
Toward improved image-based root phenotyping: Handling temporal and cross-site domain shifts in crop root segmentation models
Travis Banet, Abraham George Smith, Rebecca K. McGrail, D. McNear, Hanna J. Poffenbarger
The Plant Phenome Journal | DOI: 10.1002/ppj2.20094 | Published 2024-01-30
Abstract: Crop root segmentation models developed through deep learning have increased the throughput of in situ crop phenotyping studies. However, models trained to identify roots in one image dataset may not accurately identify roots in another dataset, especially when the new dataset contains known differences, called domain shifts. The objective of this study was to quantify how model performance changes when models segment image datasets that contain domain shifts, and to evaluate approaches for reducing the associated error. We collected maize root images at two growth stages (V7 and R2) in a field experiment and manually segmented images to measure total root length (TRL). We developed five segmentation models and evaluated each model's ability to handle a temporal (growth-stage) domain shift. For the V7 growth stage, a growth-stage-specific model trained only on images captured at the V7 growth stage was best suited for measuring TRL. At the R2 growth stage, combining images from both growth stages into a single training dataset produced the most accurate TRL measurements. We applied two of the field models to images from a greenhouse experiment to evaluate how model performance changed under a cross-site domain shift. Field models were less accurate than models trained only on the greenhouse images, even when crop growth stage was identical. Although models may perform well for one experiment, model error increases when they are applied to images from different experiments, even when crop species, growth stage, and soil type are similar.
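One simple way to express the cross-domain error this abstract discusses is the relative error of model-measured total root length (TRL) against manual segmentation, averaged over a dataset. This is an illustrative metric, not necessarily the one the authors report:

```python
def trl_relative_error(predicted_trl, manual_trl):
    """Per-image relative error of model-measured total root length
    against the manually segmented ground truth."""
    return abs(predicted_trl - manual_trl) / manual_trl

def mean_relative_error(pairs):
    """Average relative TRL error over (predicted, manual) pairs; a single
    scalar for comparing a model's performance across domain shifts,
    e.g. field-trained vs. greenhouse-trained models on greenhouse images."""
    return sum(trl_relative_error(p, m) for p, m in pairs) / len(pairs)
```

Comparing this scalar for a model evaluated in-domain versus out-of-domain quantifies the accuracy drop the study attributes to temporal and cross-site shifts.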