A Digital Green Thumb: Neural Networks to Monitor Hydroponic Plant Growth

M. Tenzer, N. Clifford
DOI: 10.1109/SIEDS49339.2020.9106645
Published in: 2020 Systems and Information Engineering Design Symposium (SIEDS), April 2020
Citations: 2

Abstract

Hydroponics systems present an extraordinary opportunity to lessen the environmental impact of agriculture and increase access to fresh produce. Automated hydroponics systems contain many sensors to monitor plant growth and health, but recovering information about plant status is a non-trivial task; most methods require specialized camera hardware or extensive manually annotated data. A common alternative is simply to photograph the plants with an ordinary digital camera and calculate the percentage of the image that is green, since previous research links this percentage closely to plant biomass; however, this approach fails with anthocyanin-producing (purple) plants and with night-vision (greyscale) imagery. We developed a data-driven approach that requires no manual annotation. For each of 20 distinct time series of green plant images, we calculated which pixels were green, as a proxy label for which pixels were occupied by plant matter. We then converted all images to greyscale and trained convolutional neural networks, inspired by the state-of-the-art object detection and image segmentation literature, to take a greyscale image and classify which pixels were originally green. We systematically compared several network architectures, including Unet, Linknet, FPN, and PSPNet, each using a 34-layer ResNet encoder, and evaluated model performance by ten-fold cross-validation, training on 18 series and holding out two per fold. We calculated cross-validated receiver operating characteristic (ROC) curves for each model and achieved a maximum validation-set area under the curve (AUC) above 0.92 after only ten epochs of training from randomly initialized weights. Time series plots of the average per-pixel predicted probability (the predicted percent greenness of an image) followed the true percentages quite closely but displayed much smoother and more interpretable trends, even when the true label was very noisy. The resulting plant growth index retains the power of the simpler percent-green metric but, by design, generalizes to difficult images where green is completely absent. We have therefore developed a robust and deployable monitoring system for the growth of diverse plant species in automated hydroponics systems.
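The percent-green proxy labeling described in the abstract can be sketched as follows. The abstract does not state the exact color test the authors used, so the green-dominance rule and its `margin` threshold here are illustrative assumptions:

```python
import numpy as np

def green_mask(rgb, margin=10):
    """Label a pixel as plant matter when its green channel dominates
    both red and blue. `margin` is an illustrative threshold, not one
    taken from the paper.

    rgb: H x W x 3 uint8 array.
    """
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g > r + margin) & (g > b + margin)

def percent_green(rgb, margin=10):
    """Fraction of pixels labeled green -- the simple biomass proxy."""
    return green_mask(rgb, margin).mean()
```

For example, an image whose top half is strongly green and whose bottom half is black yields a percent-green of 0.5.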
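The training setup the abstract describes — a greyscale image as network input, the original greenness of each pixel as the target — can be sketched like this. The luminance weights are the standard ITU-R BT.601 coefficients; the green test and its `margin` are illustrative, as the paper's exact preprocessing is not given in the abstract:

```python
import numpy as np

def make_training_pair(rgb, margin=10):
    """Build one (input, target) pair: greyscale image in, green mask out.

    rgb: H x W x 3 uint8 array.
    Returns (grey in [0, 1], per-pixel binary target as float32).
    """
    r, g, b = (rgb[..., i].astype(float) for i in range(3))
    grey = 0.299 * r + 0.587 * g + 0.114 * b        # network input
    target = (g > r + margin) & (g > b + margin)    # "was originally green"
    return grey / 255.0, target.astype(np.float32)
```

A segmentation network (e.g. a Unet with a ResNet-34 encoder, as compared in the paper) would then be trained to predict `target` from `grey`.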
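The ten-fold, series-level cross-validation (train on 18 of the 20 time series, hold out two per fold) can be sketched as below; the function name is illustrative. Splitting at the series level keeps every frame of a plant's time series on the same side of the split, which avoids leakage between near-duplicate images:

```python
def series_folds(n_series=20, n_folds=10):
    """Yield (train_ids, val_ids) with whole series held out per fold.

    With the defaults, each fold holds out exactly two series and
    trains on the remaining eighteen, matching the paper's protocol.
    """
    per_fold = n_series // n_folds  # 2 series per fold
    ids = list(range(n_series))
    for k in range(n_folds):
        val = ids[k * per_fold:(k + 1) * per_fold]
        train = [i for i in ids if i not in val]
        yield train, val
```

Each series appears in exactly one validation fold, so the cross-validated ROC curve aggregates predictions on every series exactly once.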