{"title":"Crop Yield Prediction Using Multimodal Meta-Transformer and Temporal Graph Neural Networks","authors":"Somrita Sarkar;Anamika Dey;Ritam Pradhan;Upendra Mohan Sarkar;Chandranath Chatterjee;Arijit Mondal;Pabitra Mitra","doi":"10.1109/TAFE.2024.3438330","DOIUrl":null,"url":null,"abstract":"Crop yield prediction is a crucial task in agricultural science, involving the classification of potential yield into various levels. This is vital for both farmers and policymakers. The features considered for this task are diverse, including weather, soil, and historical yield data. Recently, plant images captured in different modalities, such as red–green–blue, infrared, and multispectral bands, have also been utilized. Most of these data are inherently temporal. Integrating such multimodal and temporal data is advantageous for yield classification. In this work, a deep learning framework based on meta-transformers and temporal graph neural networks has been proposed to achieve this goal. Meta-Transformers allow the modeling of multimodal interactions, while temporayel graph neural networks enable the utilization of time sequences. Experimental results on the publicly available EPFL multimodal dataset demonstrate that the proposed framework achieves a high classification accuracy of nearly 97%, surpassing other state-of-the-art models, such as long short-term memory networks, 1-D convolutional neural networks, and Transformers. In addition, the proposed model excels in accuracy metrics, with a precision of approximately 98%, an F1-Score of 91%, and a recall of 94% in crop yield prediction.","PeriodicalId":100637,"journal":{"name":"IEEE Transactions on AgriFood Electronics","volume":"2 2","pages":"545-553"},"PeriodicalIF":0.0000,"publicationDate":"2024-08-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on AgriFood Electronics","FirstCategoryId":"1085","ListUrlMain":"https://ieeexplore.ieee.org/document/10638066/","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Crop yield prediction is a crucial task in agricultural science, involving the classification of potential yield into various levels. This is vital for both farmers and policymakers. The features considered for this task are diverse, including weather, soil, and historical yield data. Recently, plant images captured in different modalities, such as red–green–blue, infrared, and multispectral bands, have also been utilized. Most of these data are inherently temporal. Integrating such multimodal and temporal data is advantageous for yield classification. In this work, a deep learning framework based on meta-transformers and temporal graph neural networks is proposed to achieve this goal. Meta-transformers allow the modeling of multimodal interactions, while temporal graph neural networks enable the utilization of time sequences. Experimental results on the publicly available EPFL multimodal dataset demonstrate that the proposed framework achieves a high classification accuracy of nearly 97%, surpassing other state-of-the-art models, such as long short-term memory networks, 1-D convolutional neural networks, and Transformers. In addition, the proposed model performs strongly on other evaluation metrics, achieving a precision of approximately 98%, an F1-score of 91%, and a recall of 94% in crop yield prediction.
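To make the described pipeline concrete, below is a minimal PyTorch sketch of the kind of architecture the abstract outlines: per-modality encoders, a shared transformer backbone fusing modality tokens (in the spirit of a meta-transformer), a simple message-passing layer over a temporal chain graph, and a yield-level classifier. This is not the authors' code; all module names, dimensions, the chain-graph adjacency, and the pooling choices are assumptions for illustration only.

```python
# Illustrative sketch (not the paper's released implementation) of a
# multimodal meta-transformer + temporal GNN yield classifier.
import torch
import torch.nn as nn

class ModalityEncoder(nn.Module):
    """Projects one modality (e.g., weather, soil, image features) into a shared token space."""
    def __init__(self, in_dim: int, d_model: int):
        super().__init__()
        self.proj = nn.Sequential(nn.Linear(in_dim, d_model), nn.ReLU(), nn.LayerNorm(d_model))

    def forward(self, x):                 # x: (batch, time, in_dim)
        return self.proj(x)               # (batch, time, d_model)

class TemporalGraphLayer(nn.Module):
    """One message-passing step over a chain graph linking consecutive time steps."""
    def __init__(self, d_model: int):
        super().__init__()
        self.lin = nn.Linear(d_model, d_model)

    def forward(self, h):                 # h: (batch, time, d_model)
        T = h.size(1)
        # Adjacency of a temporal chain with self-loops (a hypothetical choice;
        # the paper may construct or learn a different graph).
        adj = torch.eye(T, device=h.device)
        idx = torch.arange(T - 1, device=h.device)
        adj[idx, idx + 1] = 1.0
        adj[idx + 1, idx] = 1.0
        adj = adj / adj.sum(dim=1, keepdim=True)   # row-normalize neighbor weights
        return torch.relu(self.lin(adj @ h))       # aggregate neighbors, then transform

class YieldClassifier(nn.Module):
    def __init__(self, modality_dims, d_model=64, n_heads=4, n_layers=2, n_classes=3):
        super().__init__()
        self.encoders = nn.ModuleList(ModalityEncoder(d, d_model) for d in modality_dims)
        fusion_layer = nn.TransformerEncoderLayer(d_model, n_heads, batch_first=True)
        self.fusion = nn.TransformerEncoder(fusion_layer, n_layers)  # shared fusion backbone
        self.tgnn = TemporalGraphLayer(d_model)
        self.head = nn.Linear(d_model, n_classes)   # yield levels (e.g., low/medium/high)

    def forward(self, modalities):        # list of (batch, time, in_dim_i) tensors
        # Encode each modality, then stack modality tokens per time step.
        tokens = torch.stack([enc(x) for enc, x in zip(self.encoders, modalities)], dim=2)
        B, T, M, D = tokens.shape
        fused = self.fusion(tokens.reshape(B * T, M, D)).mean(dim=1)  # fuse across modalities
        h = self.tgnn(fused.reshape(B, T, D))                         # propagate along the time graph
        return self.head(h.mean(dim=1))                               # pool over time, predict yield level

# Usage with random stand-ins for weather, soil, and image-feature streams.
model = YieldClassifier(modality_dims=[8, 5, 128])
batch = [torch.randn(4, 12, d) for d in (8, 5, 128)]   # 4 fields, 12 time steps each
logits = model(batch)                                  # shape: (4, 3)
```

The design choice worth noting is the two-stage factorization: cross-modality attention is applied independently at each time step, and temporal dependencies are then handled by the graph layer, which keeps the attention cost linear in sequence length rather than quadratic over all modality-time tokens.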