Shih-Yu Chen, Yu-Cheng Wu, Yung-Ming Kuo, Rui-Hong Zhang, Tsai-Yi Cheng, Yu-Chien Chen, Po-Yu Chu, Li-Wei Kang, Chinsu Lin
Title: Automated peanut defect detection using hyperspectral imaging and deep learning: A real-time approach for smart agriculture
Journal: Smart Agricultural Technology, Volume 11, Article 100943 (JCR Q1, Agricultural Engineering; Impact Factor 6.3)
DOI: 10.1016/j.atech.2025.100943
Published: 2025-04-08
URL: https://www.sciencedirect.com/science/article/pii/S2772375525001765
Citations: 0
Abstract
Manual visual inspection remains the prevailing approach for peanut quality classification; however, it is labor-intensive, prone to fatigue-induced errors, and often results in inconsistent outcomes. Peanut defects are typically categorized into four classes: healthy, underdeveloped, insect-damaged, and ruptured. This paper proposes an automated classification framework that integrates push-broom and snapshot hyperspectral imaging techniques with deep learning models for accurate and efficient peanut defect detection. A push-broom hyperspectral imaging system was employed to acquire a dataset of 1557 peanut samples, divided into a training set (477 samples: 237 healthy, 240 defective) and a test set (1080 samples). Spectral band selection was applied to reduce data dimensionality, followed by the development and evaluation of 1D, 2D, and 3D Convolutional Neural Network (CNN) models. Among them, the 3D-CNN architecture achieved the highest classification accuracy of 98 %. In addition, the snapshot imaging system enabled the construction of a lightweight CNN model for real-time defect detection. Principal Component Analysis (PCA) was utilized to identify five informative spectral bands, enabling efficient classification with an overall accuracy of 98.5 % and a Kappa coefficient of 97.3 %. The novelty of this study lies in the dual integration of push-broom and snapshot hyperspectral imaging with hybrid CNN architectures, enabling both high-accuracy offline analysis and lightweight real-time detection. The combination of spectral dimensionality reduction and attention-based modeling presents a scalable and computationally efficient solution for quality assessment. These findings represent a significant advancement in automated peanut grading, offering a robust, cost-effective, and scalable approach for deployment in smart agriculture and automated food quality control systems.
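The abstract describes using PCA to identify five informative spectral bands from the hyperspectral cube before classification. The paper's exact band-selection procedure is not given here, so the following is only a minimal sketch of one common PCA-based approach: score each band by the magnitude of its loadings on the leading principal components, then keep the top five. The cube dimensions and synthetic data are illustrative assumptions, not values from the study.

```python
import numpy as np

# Hedged sketch of PCA-based spectral band selection, assuming a
# hyperspectral cube of shape (height, width, bands). Synthetic random
# data stands in for real peanut scans; 60 bands is an arbitrary choice.
rng = np.random.default_rng(0)
cube = rng.random((32, 32, 60))

# Flatten to (pixels, bands) and mean-center each band
X = cube.reshape(-1, cube.shape[-1])
Xc = X - X.mean(axis=0)

# PCA via SVD: rows of Vt are the principal axes in band space
_, _, Vt = np.linalg.svd(Xc, full_matrices=False)

# Score each band by its total loading magnitude on the top components,
# then keep the five highest-scoring bands (matching the paper's count)
n_components, n_bands = 5, 5
scores = np.abs(Vt[:n_components]).sum(axis=0)
selected = np.sort(np.argsort(scores)[::-1][:n_bands])
print("selected bands:", selected)
```

The selected band indices would then feed a lightweight CNN operating on only those five channels, which is what makes real-time inference on the snapshot imaging system tractable.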