Wheat Leaf Disease Detection: A Lightweight Approach with Shallow CNN Based Feature Refinement

Oumayma Jouini, Mohamed Ould-Elhassen Aoueileyine, K. Sethom, Anis Yazidi
{"title":"Wheat Leaf Disease Detection: A Lightweight Approach with Shallow CNN Based Feature Refinement","authors":"Oumayma Jouini, Mohamed Ould-Elhassen Aoueileyine, K. Sethom, Anis Yazidi","doi":"10.3390/agriengineering6030117","DOIUrl":null,"url":null,"abstract":"Improving agricultural productivity is essential due to rapid population growth, making early detection of crop diseases crucial. Although deep learning shows promise in smart agriculture, practical applications for identifying wheat diseases in complex backgrounds are limited. In this paper, we propose CropNet, a hybrid method that utilizes Red, Green, and Blue (RGB) imaging and a transfer learning approach combined with shallow convolutional neural networks (CNN) for further feature refinement. To develop our customized model, we conducted an extensive search for the optimal deep learning architecture. Our approach involves freezing the pre-trained model for feature extraction and adding a custom trainable CNN layer. Unlike traditional transfer learning, which typically uses trainable dense layers, our method integrates a trainable CNN, deepening the architecture. We argue that pre-trained features in transfer learning are better suited for a custom shallow CNN followed by a fully connected layer, rather than being fed directly into fully connected layers. We tested various architectures for pre-trained models including EfficientNetB0 and B2, DenseNet, ResNet50, MobileNetV2, MobileNetV3-Small, and Inceptionv3. Our approach combines the strengths of pre-trained models with the flexibility of custom architecture design, offering efficiency, effective feature extraction, customization options, reduced overfitting, and differential learning rates. It distinguishes itself from classical transfer learning techniques, which typically fine-tune the entire pre-trained network. Our aim is to provide a lightweight model suitable for resource-constrained environments, capable of delivering outstanding results. CropNet achieved 99.80% accuracy in wheat disease detection with reduced training time and computational cost. This efficient performance makes CropNet promising for practical implementation in resource-constrained agricultural settings, benefiting farmers and enhancing production.","PeriodicalId":505370,"journal":{"name":"AgriEngineering","volume":"35 6","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"AgriEngineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/agriengineering6030117","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Improving agricultural productivity is essential due to rapid population growth, making early detection of crop diseases crucial. Although deep learning shows promise in smart agriculture, practical applications for identifying wheat diseases in complex backgrounds are limited. In this paper, we propose CropNet, a hybrid method that utilizes Red, Green, and Blue (RGB) imaging and a transfer learning approach combined with a shallow convolutional neural network (CNN) for further feature refinement. To develop our customized model, we conducted an extensive search for the optimal deep learning architecture. Our approach involves freezing the pre-trained model for feature extraction and adding a custom trainable CNN layer. Unlike traditional transfer learning, which typically uses trainable dense layers, our method integrates a trainable CNN, deepening the architecture. We argue that pre-trained features in transfer learning are better suited to a custom shallow CNN followed by a fully connected layer than to being fed directly into fully connected layers. We tested various pre-trained architectures, including EfficientNetB0, EfficientNetB2, DenseNet, ResNet50, MobileNetV2, MobileNetV3-Small, and InceptionV3. Our approach combines the strengths of pre-trained models with the flexibility of custom architecture design, offering efficiency, effective feature extraction, customization options, reduced overfitting, and differential learning rates. It distinguishes itself from classical transfer learning techniques, which typically fine-tune the entire pre-trained network. Our aim is to provide a lightweight model suitable for resource-constrained environments, capable of delivering outstanding results. CropNet achieved 99.80% accuracy in wheat disease detection with reduced training time and computational cost. This efficient performance makes CropNet promising for practical implementation in resource-constrained agricultural settings, benefiting farmers and enhancing production.
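
To make the described design concrete, the sketch below shows one plausible way to wire a frozen pre-trained backbone to a trainable shallow CNN and a fully connected head in TensorFlow/Keras. It is not the authors' CropNet code: the choice of MobileNetV3-Small as the backbone (picked arbitrarily from the backbones listed in the abstract), the 128-filter refinement block, the dropout rate, the optimizer settings, and the four-class output are all illustrative assumptions.

```python
# Minimal sketch of the abstract's idea, NOT the authors' exact CropNet model:
# frozen ImageNet backbone -> trainable shallow CNN refinement -> dense head.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 4            # assumption: e.g. healthy + three wheat diseases
INPUT_SHAPE = (224, 224, 3)

# 1. Frozen pre-trained backbone used purely as a feature extractor.
backbone = tf.keras.applications.MobileNetV3Small(
    include_top=False, weights="imagenet", input_shape=INPUT_SHAPE)
backbone.trainable = False

# 2. Custom trainable shallow CNN that refines the pre-trained feature maps,
#    instead of feeding them directly into dense layers.
inputs = layers.Input(shape=INPUT_SHAPE)
x = backbone(inputs, training=False)
x = layers.Conv2D(128, 3, padding="same", activation="relu")(x)
x = layers.BatchNormalization()(x)
x = layers.GlobalAveragePooling2D()(x)

# 3. Fully connected classification head.
x = layers.Dropout(0.3)(x)
outputs = layers.Dense(NUM_CLASSES, activation="softmax")(x)

model = models.Model(inputs, outputs)
model.compile(optimizer=tf.keras.optimizers.Adam(1e-3),
              loss="categorical_crossentropy", metrics=["accuracy"])
model.summary()
```

The design choice mirrored here is that the pre-trained feature maps pass through a trainable convolutional block before pooling and classification, rather than being flattened straight into dense layers as in conventional transfer learning.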