Pooja Garg;Anusha Mishra;Rameez Raja;Ahlad Kumar;Manjunath V. Joshi;Vinay S Palaparthy
{"title":"集成物联网传感器和图像的多模式数据融合,用于Jamun作物病害检测和机器学习","authors":"Pooja Garg;Anusha Mishra;Rameez Raja;Ahlad Kumar;Manjunath V. Joshi;Vinay S Palaparthy","doi":"10.1109/TAFE.2025.3585065","DOIUrl":null,"url":null,"abstract":"In agricultural applications, traditional image and sensor-based methods for plant disease prediction face notable limitations. Image-based approaches often struggle with early-stage detection, while sensor-based methods prone to reliability issues due to potential system failures. This study addresses these challenges by integrating complementary data of the Jamun (Syzygium cumini) plant from Internet of Things (IoT)-enabled sensors and mobile-captured images to develop a hybrid machine learning (ML) model for early and accurate plant disease detection. The proposed model combines a multilayer perceptron (MLP) for processing numerical sensor inputs—ambient temperature, soil temperature, relative humidity, soil moisture, and leaf wetness duration—and a convolutional neural network (CNN) for analyzing leaf images labeled as leaf spot, anthracnose, or healthy. Outputs from the MLP and CNN concatenated and processed through an additional MLP to classify plant health effectively. Optimized with hidden layer configurations of 8-16-32-8 for the sensor-data MLP, 16--32-64-128_32-8-4 for the image-data CNN, and 4-3 layers for the final MLP, the model achieves a loss of 1% and an accuracy of 95%, outperforming state-of-the-art methods, such as DenseNet201-support vector machines (SVM) (87.23%) and gray level co-occurrence matrix-SVM (90%). Performance metrics demonstrate high precision (leaf spot: 0.93, anthracnose: 0.93, and healthy: 0.98), recall (leaf spot: 0.92, anthracnose: 0.95, and healthy: 0.96), and F1-scores (leaf spot: 0.92, anthracnose: 0.94, and healthy: 0.97). The model’s deployment on an Amazon Web Services cloud server enables real-time disease detection and classification, making it accessible for practical agricultural use. 
This sensor and image data integration offers a novel and robust solution to address the limitations of single-modality approaches.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"3 2","pages":"582-590"},"PeriodicalIF":0.0000,"publicationDate":"2025-08-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multimodal Data Fusion by Integrating IoT-Enabled Sensors and Images for Jamun Crop Disease Detection With Machine Learning\",\"authors\":\"Pooja Garg;Anusha Mishra;Rameez Raja;Ahlad Kumar;Manjunath V. Joshi;Vinay S Palaparthy\",\"doi\":\"10.1109/TAFE.2025.3585065\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In agricultural applications, traditional image and sensor-based methods for plant disease prediction face notable limitations. Image-based approaches often struggle with early-stage detection, while sensor-based methods prone to reliability issues due to potential system failures. This study addresses these challenges by integrating complementary data of the Jamun (Syzygium cumini) plant from Internet of Things (IoT)-enabled sensors and mobile-captured images to develop a hybrid machine learning (ML) model for early and accurate plant disease detection. The proposed model combines a multilayer perceptron (MLP) for processing numerical sensor inputs—ambient temperature, soil temperature, relative humidity, soil moisture, and leaf wetness duration—and a convolutional neural network (CNN) for analyzing leaf images labeled as leaf spot, anthracnose, or healthy. Outputs from the MLP and CNN concatenated and processed through an additional MLP to classify plant health effectively. 
Optimized with hidden layer configurations of 8-16-32-8 for the sensor-data MLP, 16--32-64-128_32-8-4 for the image-data CNN, and 4-3 layers for the final MLP, the model achieves a loss of 1% and an accuracy of 95%, outperforming state-of-the-art methods, such as DenseNet201-support vector machines (SVM) (87.23%) and gray level co-occurrence matrix-SVM (90%). Performance metrics demonstrate high precision (leaf spot: 0.93, anthracnose: 0.93, and healthy: 0.98), recall (leaf spot: 0.92, anthracnose: 0.95, and healthy: 0.96), and F1-scores (leaf spot: 0.92, anthracnose: 0.94, and healthy: 0.97). The model’s deployment on an Amazon Web Services cloud server enables real-time disease detection and classification, making it accessible for practical agricultural use. This sensor and image data integration offers a novel and robust solution to address the limitations of single-modality approaches.\",\"PeriodicalId\":100637,\"journal\":{\"name\":\"IEEE Transactions on AgriFood Electronics\",\"volume\":\"3 2\",\"pages\":\"582-590\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2025-08-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on AgriFood Electronics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/11119722/\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on AgriFood 
Electronics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/11119722/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Multimodal Data Fusion by Integrating IoT-Enabled Sensors and Images for Jamun Crop Disease Detection With Machine Learning
In agricultural applications, traditional image- and sensor-based methods for plant disease prediction face notable limitations. Image-based approaches often struggle with early-stage detection, while sensor-based methods are prone to reliability issues due to potential system failures. This study addresses these challenges by integrating complementary data on the Jamun (Syzygium cumini) plant from Internet of Things (IoT)-enabled sensors and mobile-captured images to develop a hybrid machine learning (ML) model for early and accurate plant disease detection. The proposed model combines a multilayer perceptron (MLP) for processing numerical sensor inputs—ambient temperature, soil temperature, relative humidity, soil moisture, and leaf wetness duration—and a convolutional neural network (CNN) for analyzing leaf images labeled as leaf spot, anthracnose, or healthy. Outputs from the MLP and CNN are concatenated and processed through an additional MLP to classify plant health effectively. Optimized with hidden layer configurations of 8-16-32-8 for the sensor-data MLP, 16-32-64-128 convolutional filters followed by 32-8-4 dense layers for the image-data CNN, and 4-3 layers for the final MLP, the model achieves a loss of 1% and an accuracy of 95%, outperforming state-of-the-art methods such as DenseNet201 with support vector machines (SVM) (87.23%) and gray-level co-occurrence matrix with SVM (90%). Performance metrics demonstrate high precision (leaf spot: 0.93, anthracnose: 0.93, healthy: 0.98), recall (leaf spot: 0.92, anthracnose: 0.95, healthy: 0.96), and F1-scores (leaf spot: 0.92, anthracnose: 0.94, healthy: 0.97). The model's deployment on an Amazon Web Services cloud server enables real-time disease detection and classification, making it accessible for practical agricultural use. This integration of sensor and image data offers a novel and robust solution to the limitations of single-modality approaches.
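The late-fusion architecture described in the abstract (a sensor-branch MLP and an image-branch CNN whose outputs are concatenated and passed through a final MLP) could be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the layer widths follow the configurations quoted in the abstract (8-16-32-8, 16-32-64-128 with 32-8-4 dense, and a 4-3 head), but the image resolution, kernel sizes, pooling scheme, and activation choices are assumptions.

```python
import torch
import torch.nn as nn


class JamunFusionModel(nn.Module):
    """Hedged sketch of the two-branch sensor/image fusion classifier."""

    def __init__(self):
        super().__init__()
        # Sensor branch: 5 inputs (ambient temp, soil temp, relative
        # humidity, soil moisture, leaf wetness duration) -> 8-16-32-8 MLP.
        self.sensor_mlp = nn.Sequential(
            nn.Linear(5, 8), nn.ReLU(),
            nn.Linear(8, 16), nn.ReLU(),
            nn.Linear(16, 32), nn.ReLU(),
            nn.Linear(32, 8), nn.ReLU(),
        )
        # Image branch: 16-32-64-128 conv filters, then 32-8-4 dense layers.
        # Kernel sizes, pooling, and global average pooling are assumptions.
        self.image_cnn = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(128, 32), nn.ReLU(),
            nn.Linear(32, 8), nn.ReLU(),
            nn.Linear(8, 4), nn.ReLU(),
        )
        # Fusion head: concatenated 8 + 4 = 12 features -> 4 -> 3 classes
        # (leaf spot, anthracnose, healthy).
        self.head = nn.Sequential(
            nn.Linear(12, 4), nn.ReLU(),
            nn.Linear(4, 3),
        )

    def forward(self, sensors: torch.Tensor, images: torch.Tensor) -> torch.Tensor:
        fused = torch.cat([self.sensor_mlp(sensors), self.image_cnn(images)], dim=1)
        return self.head(fused)  # raw logits over the three classes


if __name__ == "__main__":
    model = JamunFusionModel()
    logits = model(torch.randn(2, 5), torch.randn(2, 3, 64, 64))
    print(logits.shape)  # batch of 2, one logit per class
```

Late fusion of this kind lets each modality be trained and failure-tested independently, which is what makes the approach robust to a failing sensor node or a low-quality photograph.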