Automatic plant disease detection using computationally efficient convolutional neural network
Muhammad Rizwan, Samina Bibi, S. Haq, Muhammad Asif, Tariqullah Jan, M. H. Zafar
Engineering Reports, published 2024-06-13. DOI: 10.1002/eng2.12944
Abstract
Agricultural plants are the fundamental source of nutrients worldwide. Disease outbreaks in these plants lead to food scarcity and can result in catastrophic losses. These diseases can be managed through manual or automatic approaches. The manual approach, in which plant pathologists inspect fields, is costly, error-prone, and time-consuming. Alternatively, automatic approaches process 2D plant images with machine learning. The current study opts for the latter approach because of its advantages in speed, efficiency, and convenience. Prominent convolutional neural network (CNN) models, such as MobileNet, ResNet50, Inception, and Xception, are preferred for automatic plant disease detection due to their high performance, but they demand substantial computational resources, which restricts their use largely to large-scale farmers. The proposed study developed a novel CNN model that is suitable for small-scale farmers. The numerical outcomes indicate that the proposed model surpassed the state-of-the-art models by achieving an average accuracy of 96.86%. The proposed model required comparatively limited computational resources, as analyzed through floating-point operations (FLOPs), number of parameters, computation time, and model size. Furthermore, a statistical approach was proposed to analyze a model while collectively accounting for its performance and computational complexity. The results show that the proposed model outperformed the state-of-the-art techniques in terms of both average recognition accuracy and computational complexity.
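The abstract does not specify the proposed architecture, so the sketch below is only a minimal illustration of the kind of computationally efficient CNN classifier the paper describes: a few small convolutional blocks followed by global average pooling instead of large dense layers, which keeps the parameter count and FLOPs low. The input size (224x224 RGB) and the class count of 38 (typical of the PlantVillage dataset) are assumptions, not details taken from the paper.

```python
# Minimal sketch of a lightweight CNN for leaf-disease classification.
# Assumptions (not from the paper): 224x224 RGB inputs, 38 output classes.
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 38          # assumed; depends on the dataset actually used
INPUT_SHAPE = (224, 224, 3)

def build_lightweight_cnn() -> tf.keras.Model:
    """Small CNN intended to keep parameters and FLOPs low."""
    model = models.Sequential([
        layers.Input(shape=INPUT_SHAPE),
        layers.Conv2D(16, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(32, 3, padding="same", activation="relu"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, padding="same", activation="relu"),
        layers.GlobalAveragePooling2D(),  # avoids a large flatten + dense block
        layers.Dense(NUM_CLASSES, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_lightweight_cnn()
model.summary()  # reports the trainable parameter count used in complexity comparisons
```

The parameter count printed by `model.summary()` is one of the complexity measures the abstract mentions; FLOPs, inference time, and on-disk model size would be measured separately in the same spirit.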