Ivan Roy S. Evangelista, M. Cabatuan, Lorelyn Joy T. Milagrosa, A. Bandala, Ronnie S. Concepcion, Elmer P. Dadios

Title: Zea Mays Multi-Disease Classification and Severity Assessment with EfficientNetV2 Variants
Venue: 2023 IEEE Region 10 Symposium (TENSYMP)
Publication date: 2023-09-06
DOI: 10.1109/TENSYMP55890.2023.10223621 (https://doi.org/10.1109/TENSYMP55890.2023.10223621)
Citations: 0
Abstract
Despite the advances achieved in crop protection and management, crop diseases remain a problem for corn farmers. Numerous studies have demonstrated the efficacy of corn disease detection and classification using machine learning-based vision detectors. However, many of these models rely on datasets of lab-based images that do not accurately depict real-world conditions. In this study, a vision-based corn disease detector and classifier is developed with EfficientNetV2 as the base architecture. In addition, an algorithm for evaluating the severity of the disease has also been created. Two datasets were built to train the models: common corn diseases (CCD) and corn disease severity (CDS). The EfficientNetV2-B0, B1, B2, B3, and S models, pre-trained on ImageNet, were explored for feature extraction. A custom classifier head, consisting of a single convolutional (CNN) layer and two fully connected layers, is incorporated into the EfficientNetV2-based model to complete the architecture. Transfer learning and fine-tuning were employed to improve performance. The models were evaluated on accuracy, cross-entropy loss, precision, recall, and F1-score. The EfficientNetV2-B2 model performed best on the disease classification task, with an accuracy of 95.74%. The EfficientNetV2-B3 was the top performer on the disease severity assessment task, with an accuracy of 98.73%. The EfficientNetV2-S also surpassed other proposed models on the PlantVillage (PV) dataset, with an accuracy of 99.52%.
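The abstract names the evaluation metrics (accuracy, precision, recall, F1-score) used to compare the EfficientNetV2 variants. A minimal sketch of how these per-class metrics are derived from a multi-class confusion matrix is shown below; the 3x3 matrix and its counts are illustrative assumptions, not results from the paper, and the paper may use a different averaging scheme.

```python
# Hedged sketch: computing per-class precision, recall, and F1-score
# from a multi-class confusion matrix, as used to evaluate the
# EfficientNetV2 variants. Counts below are made up for illustration.
def precision_recall_f1(confusion):
    """confusion[i][j] = number of samples with true class i predicted as class j."""
    n = len(confusion)
    stats = []
    for c in range(n):
        tp = confusion[c][c]                               # true positives for class c
        fp = sum(confusion[r][c] for r in range(n)) - tp   # column sum minus diagonal
        fn = sum(confusion[c]) - tp                        # row sum minus diagonal
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * prec * rec / (prec + rec) if prec + rec else 0.0
        stats.append((prec, rec, f1))
    return stats

# Hypothetical confusion matrix for a 3-class disease classifier
cm = [[50, 2, 1],
      [3, 45, 2],
      [0, 4, 43]]

per_class = precision_recall_f1(cm)
accuracy = sum(cm[i][i] for i in range(3)) / sum(map(sum, cm))
```

Here accuracy is the trace of the matrix over the total sample count; macro-averaging the per-class F1 scores would give a single summary figure comparable across model variants.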