CNN-Transformer fusion for improving maize kernel storage quality: A new approach to reducing storage risks

IF 2.7 · JCR Q1 (Entomology) · CAS Tier 2 (Agricultural and Forestry Sciences)
Like Zhao, Weishi Jia, Mengting Tao, Huawei Jiang, Zhen Yang, Xixi Liu
Journal of Stored Products Research, Volume 114, Article 102722
DOI: 10.1016/j.jspr.2025.102722
Published: 2025-06-11
URL: https://www.sciencedirect.com/science/article/pii/S0022474X2500181X
Citations: 0

Abstract

The efficient identification of imperfect maize kernels is of crucial importance for optimizing storage management and ensuring quality preservation. Image-based technologies, known for their rapid and non-destructive characteristics, play a pivotal role in this endeavor. However, these technologies encounter significant challenges due to high inter-class similarity and substantial intra-class variability among maize kernels. To address these challenges, we propose CTNet, an advanced model that synergistically integrates a Convolutional Neural Network (CNN) with the Transformer architecture. This integration is further enhanced by incorporating a Feature Attention Module (FAM) and a DW-Swin Transformer, which collectively facilitate the fusion of local and global features. A Fine-Grained Perception Module (FGPM) is employed to augment the model's sensitivity to subtle imperfections. The model, supplemented with a linear classifier, ensures precise discrimination of kernel quality. CTNet was trained on the comprehensive GrainSpace dataset and achieved 1.35% higher accuracy than the baseline model, Swin Transformer, on the test set. Compared with other methods, our model demonstrates higher accuracy in identifying defective maize grains and achieves more efficient parameter utilization in its structure. Despite its moderate parameter count and FLOP budget, it enhances recognition performance while maintaining low complexity. As imperfect maize grains directly impact quality and storage safety, the model's effective recognition capability is critical for reducing storage losses and ensuring safety, thus providing strong support for smart agriculture.
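The abstract does not disclose implementation details of the FAM, DW-Swin Transformer, or FGPM. As a rough conceptual sketch only (not the authors' actual design), the core idea of fusing a local convolutional branch with a global attention branch can be illustrated in plain NumPy. All function names, the depthwise-convolution choice for the local branch, the single-head attention with identity projections, and the `alpha` blending weight are illustrative assumptions, not details from the paper.

```python
import numpy as np

def depthwise_conv3x3(x, kernels):
    # Local branch: one 3x3 filter per channel (depthwise convolution).
    # x: (H, W, C); kernels: (3, 3, C)
    H, W, C = x.shape
    pad = np.pad(x, ((1, 1), (1, 1), (0, 0)))
    out = np.zeros_like(x)
    for i in range(H):
        for j in range(W):
            out[i, j] = np.sum(pad[i:i + 3, j:j + 3] * kernels, axis=(0, 1))
    return out

def self_attention(tokens):
    # Global branch: single-head scaled dot-product attention with
    # identity Q/K/V projections, for illustration only.
    # tokens: (N, C)
    N, C = tokens.shape
    scores = tokens @ tokens.T / np.sqrt(C)
    scores -= scores.max(axis=1, keepdims=True)  # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ tokens

def fuse_local_global(x, kernels, alpha=0.5):
    # Blend local (conv) and global (attention) feature maps.
    H, W, C = x.shape
    local_feat = depthwise_conv3x3(x, kernels)
    global_feat = self_attention(x.reshape(H * W, C)).reshape(H, W, C)
    return alpha * local_feat + (1.0 - alpha) * global_feat

rng = np.random.default_rng(0)
feat = rng.standard_normal((8, 8, 16))   # toy feature map
k = rng.standard_normal((3, 3, 16)) * 0.1
fused = fuse_local_global(feat, k)
print(fused.shape)  # (8, 8, 16)
```

In a real model the two branches would use learned projections and be followed by normalization and a classifier head; this sketch only shows why the fusion preserves the feature-map shape while mixing neighborhood detail with image-wide context.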
Source journal metrics:
CiteScore: 5.70
Self-citation rate: 18.50%
Articles per year: 112
Review time: 45 days
Journal description: The Journal of Stored Products Research provides an international medium for the publication of both reviews and original results from laboratory and field studies on the preservation and safety of stored products, notably food stocks, covering storage-related problems from the producer through the supply chain to the consumer. Stored products are characterised by having relatively low moisture content and include raw and semi-processed foods, animal feedstuffs, and a range of other durable items, including materials such as clothing or museum artefacts.