A Digital Green Thumb: Neural Networks to Monitor Hydroponic Plant Growth
M. Tenzer, N. Clifford
2020 Systems and Information Engineering Design Symposium (SIEDS), April 2020
DOI: 10.1109/SIEDS49339.2020.9106645
Hydroponics systems present an extraordinary opportunity to lessen the environmental impact of agriculture and increase access to fresh produce. Automated hydroponics systems contain many sensors to monitor plant growth and health, but recovering information about plant status is a non-trivial task; most methods require specialized camera hardware or extensive manually annotated data. A common alternative is simply to photograph the plants with an ordinary digital camera and calculate the percentage of the image that is green, since previous research links this percentage closely to plant biomass; however, this approach fails for anthocyanin-producing (purple) plants and for night-vision (greyscale) imagery. We developed a data-driven approach that requires no manual annotation. For each of 20 distinct time series of green-plant images, we calculated which pixels were green, a proxy for labeling which pixels were occupied by plant matter. We then converted all images to greyscale and trained convolutional neural networks, inspired by state-of-the-art object detection and image segmentation literature, to take a greyscale image and classify which pixels were originally green. We systematically compared several network architectures, including U-Net, LinkNet, FPN, and PSPNet, each with a 34-layer ResNet encoder, and evaluated model performance by ten-fold cross-validation, training on 18 series and holding out two per fold. We calculated cross-validated receiver operating characteristic (ROC) curves for each model and achieved a maximum validation-set area under the curve (AUC) of over 0.92 after only ten epochs of training from randomly initialized weights. Time series plots of the average per-pixel predicted probability (the predicted percent greenness of an image) followed the true percentages quite closely but displayed much smoother and more interpretable trends, even when the true label was very noisy.
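The labeling-and-conversion pipeline described above can be sketched in a few lines of NumPy. The abstract does not state the exact green-pixel criterion, so the channel-dominance test and its margin below are illustrative assumptions, not the authors' thresholds; the greyscale conversion uses standard luminance weights.

```python
import numpy as np

def green_mask(rgb: np.ndarray, margin: int = 20) -> np.ndarray:
    """Label a pixel as 'plant' when its green channel dominates red and blue.
    The dominance margin is an illustrative assumption, not the paper's value."""
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    return (g > r + margin) & (g > b + margin)

def to_greyscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an RGB image to greyscale with ITU-R BT.601 luminance weights."""
    grey = 0.299 * rgb[..., 0] + 0.587 * rgb[..., 1] + 0.114 * rgb[..., 2]
    return grey.astype(np.uint8)

# A greyscale image plus its green mask form one (input, target) training pair
# for the pixel-wise segmentation networks.
rgb = np.zeros((4, 4, 3), dtype=np.uint8)
rgb[:2, :2] = (30, 200, 40)          # a plant-like green patch
x, y = to_greyscale(rgb), green_mask(rgb)
```

Each greyscale frame `x` and its automatically derived mask `y` would then be fed to a segmentation network (e.g., a U-Net with a ResNet-34 encoder) that learns to recover `y` from `x` alone.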
The resulting plant growth index retains the power of the simpler percent-green metric, but by design generalizes to difficult images where green is completely absent. We have therefore developed a robust and deployable monitoring system for the growth of diverse plant species in automated hydroponics systems.
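The growth index described above reduces each predicted segmentation map to a single number per frame: the mean per-pixel plant probability. A minimal sketch, assuming the model's outputs are probability maps in [0, 1] (variable and function names here are hypothetical):

```python
import numpy as np

def growth_index(prob_maps):
    """Collapse each predicted per-pixel probability map to one scalar:
    the mean probability, i.e. the predicted percent greenness of the frame."""
    return [float(np.mean(p)) for p in prob_maps]

# Illustrative series: probability maps for three frames of a growing plant.
series = [np.full((8, 8), v) for v in (0.1, 0.3, 0.6)]
idx = growth_index(series)
```

Plotting `idx` against time yields the smoothed growth curve the paper describes, and it remains meaningful for purple or greyscale imagery where a literal percent-green count would read zero.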