Unsupervised Image Segmentation using Convolutional Neural Networks for Automated Crop Monitoring

Prakruti V. Bhatt, Sanat Sarangi, S. Pappula
DOI: 10.5220/0007687508870893
Published in: International Conference on Pattern Recognition Applications and Methods, 2019-02-19
Citations: 4

Abstract

Among efforts toward automation in agriculture, localizing and segmenting events during a crop's growth cycle is critical and can be challenging in dense foliage. Methods based on convolutional neural networks have achieved state-of-the-art results in supervised image segmentation. In this paper, we investigate an unsupervised segmentation method for monitoring crop growth and health. Individual segments are evaluated for size, color, and texture to measure possible changes in the crop, such as the emergence of a flower or fruit, a nutrient deficiency, a disease, or a pest. Supervised methods require ground-truth labels for the segments in a large number of images to train a neural network, and the trained network can then only be applied to images similar to those it was trained on. Instead, we use the spatial continuity of pixels and boundaries within a given image to iteratively update the feature representation and label assignment of every pixel with a fully convolutional network. Given that manually labeling crop images is time-consuming while quantifying event occurrences on the farm is of utmost importance, the proposed approach achieves promising results on crop images captured under different conditions. We obtained 94% accuracy in segmenting cabbage affected by the black moth pest, 81% in extracting segments affected by the Helopeltis pest on tea leaves, and 92% in spotting fruits on a citrus tree, where accuracy is defined as the intersection over union (IoU) of the resulting segments with the ground truth. The resulting segments have been used for temporal crop monitoring and for measuring severity in cases of disease or pest manifestation.
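Two pieces of the pipeline described above can be sketched concretely: the spatial-continuity constraint that drives label refinement, and the IoU score used to report accuracy. The following is a minimal illustrative sketch, not the authors' implementation: `refine_labels` stands in for one continuity pass (each pixel adopting the majority label of its 3x3 neighborhood, where the paper instead updates a fully convolutional network's features), and `iou` computes intersection over union on binary masks. Both function names and the nested-list mask representation are assumptions for illustration.

```python
from collections import Counter

def iou(pred, gt):
    """Intersection over union of two binary masks (nested lists of 0/1).

    This is the accuracy measure quoted in the abstract: overlap of a
    predicted segment with its ground-truth segment, divided by their union.
    """
    inter = sum(p and g for row_p, row_g in zip(pred, gt)
                for p, g in zip(row_p, row_g))
    union = sum(p or g for row_p, row_g in zip(pred, gt)
                for p, g in zip(row_p, row_g))
    return inter / union if union else 0.0

def refine_labels(labels):
    """One spatial-continuity pass over a label map (nested lists of ints).

    Each pixel adopts the majority label of its 3x3 neighborhood
    (including itself), suppressing isolated, spatially inconsistent
    labels -- a toy stand-in for the continuity term that guides the
    network's per-pixel label updates in the paper.
    """
    h, w = len(labels), len(labels[0])
    out = [row[:] for row in labels]
    for y in range(h):
        for x in range(w):
            patch = [labels[yy][xx]
                     for yy in range(max(0, y - 1), min(h, y + 2))
                     for xx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = Counter(patch).most_common(1)[0][0]
    return out
```

For example, a predicted pest mask covering four pixels against a three-pixel ground truth with three pixels overlapping scores an IoU of 3/4 = 0.75, and a lone mislabeled pixel inside a uniform region is removed by a single refinement pass.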