Single-wavelength near-infrared imaging and machine learning for detecting Queensland fruit fly damage in cherries

Smart Agricultural Technology, Vol. 12, Article 101090 (IF 5.7, Q1 Agricultural Engineering)
Maryam Yazdani, Dong Bao, Jun Zhou, Andy Wang, Rieks D. van Klinken
DOI: 10.1016/j.atech.2025.101090 | Published: 2025-06-09

Abstract

Efficient detection and removal of infested fruits can be a valuable tool for reducing the spread of quarantine pests through trade. Automated grading technologies offer non-destructive solutions for detecting fruit fly infestations, though current optical methods face challenges due to either high computational demands (hyperspectral) or low specificity (multi- and single-spectral). In this study, we introduced a novel imaging method and machine learning approach to detect Queensland fruit fly (Qfly) infestations in fresh cherries, at both the image and fruit levels. Using hyperspectral imaging (HSI), we identified a wavelength of 730 nm within the visible to near-infrared (NIR) spectrum as most effective for distinguishing Qfly oviposition damage from natural pigmentation and mechanical damage. A library of 1771 high-resolution, single-wavelength NIR images was created, with Qfly oviposition sites manually labelled for model training. We proposed a novel machine learning approach called the Bounding Box Histogram Fusion Classifier (BBHFC). This method transforms spot-level predictions of Qfly oviposition damage, generated by a trained object detection model, into histogram-based feature vectors. These vectors are then used for efficient and accurate image-level infestation classification. BBHFC achieved high precision, recall, and F1 scores (all > 0.93), demonstrating the effectiveness of the approach. The proposed BBHFC outperformed traditional visual inspection, achieving over 89 % accuracy, compared to 60 % for manual detection. Integrating advanced imaging techniques into grading systems can significantly enhance biosecurity in horticultural industries by detecting and removing infested fruit. This technology could also supplement existing, costly, visual inspections of traded fruit that governments are required to undertake.
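The histogram-fusion step is the core of the BBHFC idea, so a minimal sketch may help make it concrete: per-image spot-level detections from a trained object detector are summarised into a fixed-length histogram feature vector, which a lightweight classifier then uses for the image-level infestation decision. The sketch below is an illustrative assumption only, not the authors' published implementation; it bins detection confidence scores, whereas the paper may also bin other bounding-box attributes, and the function names, bin count, and choice of logistic regression are all hypothetical.

```python
# Illustrative sketch of a bounding-box-to-histogram fusion classifier.
# Assumptions (not from the paper): confidences are binned over [0, 1] into
# N_BINS bins, and a logistic-regression model does the image-level call.
import numpy as np
from sklearn.linear_model import LogisticRegression

N_BINS = 10  # assumed number of confidence bins


def detections_to_histogram(confidences, n_bins=N_BINS):
    """Bin one image's per-spot detection confidences into a fixed-length
    feature vector. An image with no detections maps to the all-zero vector."""
    hist, _ = np.histogram(confidences, bins=n_bins, range=(0.0, 1.0))
    return hist.astype(float)


def fit_image_level_classifier(per_image_confidences, labels):
    """Train an image-level classifier on histogram features.

    per_image_confidences: list of lists of detector confidences, one per image.
    labels: 1 = infested, 0 = clean.
    """
    X = np.vstack([detections_to_histogram(c) for c in per_image_confidences])
    clf = LogisticRegression(max_iter=1000)
    clf.fit(X, labels)
    return clf


# Toy usage with hypothetical detector outputs for three images.
train_conf = [[0.95, 0.88, 0.91], [0.12], []]
train_lab = [1, 0, 0]
model = fit_image_level_classifier(train_conf, train_lab)
print(model.predict([detections_to_histogram([0.90, 0.85])]))  # expected: [1]
```

The appeal of this design, as the abstract describes it, is that the expensive spatial reasoning happens once in the detector, while the image-level decision reduces to classifying a small, fixed-length vector, which keeps inference fast enough for grading-line use.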

