Lightweight CNN combined with knowledge distillation for the accurate determination of black tea fermentation degree

IF 7.0 | Region 1, Agricultural & Forestry Sciences | Q1, FOOD SCIENCE & TECHNOLOGY
Zezhong Ding, Chongshan Yang, Bin Hu, Mengqi Guo, Jinggang Li, Mengjie Wang, Zhengrui Tian, Zhiwei Chen, Chunwang Dong
{"title":"轻量级 CNN 与知识蒸馏相结合,准确测定红茶发酵程度","authors":"Zezhong Ding ,&nbsp;Chongshan Yang ,&nbsp;Bin Hu ,&nbsp;Mengqi Guo ,&nbsp;Jinggang Li ,&nbsp;Mengjie Wang ,&nbsp;Zhengrui Tian ,&nbsp;Zhiwei Chen ,&nbsp;Chunwang Dong","doi":"10.1016/j.foodres.2024.114929","DOIUrl":null,"url":null,"abstract":"<div><p>Black tea is the second most common type of tea in China. Fermentation is one of the most critical processes in its production, and it affects the quality of the finished product, whether it is insufficient or excessive. At present, the determination of black tea fermentation degree completely relies on artificial experience. It leads to inconsistent quality of black tea. To solve this problem, we use machine vision technology to distinguish the degree of fermentation of black tea based on images, this paper proposes a lightweight convolutional neural network (CNN) combined with knowledge distillation to discriminate the degree of fermentation of black tea. After comparing 12 kinds of CNN models, taking into account the size of the model and the performance of discrimination, as well as the selection principle of teacher models, Shufflenet_v2_x1.0 is selected as the student model, and Efficientnet_v2 is selected as the teacher model. Then, CrossEntropy Loss is replaced by Focal Loss. Finally, for Distillation Loss ratios of 0.6, 0.7, 0.8, 0.9, Soft Target Knowledge Distillation (ST), Masked Generative Distillation (MGD), Similarity-Preserving Knowledge Distillation (SPKD), and Attention Transfer (AT) four knowledge distillation methods are tested for their performance in distilling knowledge from the Shufflenet_v2_x1.0 model. The results show that the model discrimination performance after distillation is the best when the Distillation Loss ratio is 0.8 and the MGD method is used. This setup effectively improves the discrimination performance without increasing the number of parameters and computation volume. The model’s P, R and F1 values reach 0.9208, 0.9190 and 0.9192, respectively. It achieves precise discrimination of the fermentation degree of black tea. This meets the requirements of objective black tea fermentation judgment and provides technical support for the intelligent processing of black tea.</p></div>","PeriodicalId":323,"journal":{"name":"Food Research International","volume":null,"pages":null},"PeriodicalIF":7.0000,"publicationDate":"2024-08-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Lightweight CNN combined with knowledge distillation for the accurate determination of black tea fermentation degree\",\"authors\":\"Zezhong Ding ,&nbsp;Chongshan Yang ,&nbsp;Bin Hu ,&nbsp;Mengqi Guo ,&nbsp;Jinggang Li ,&nbsp;Mengjie Wang ,&nbsp;Zhengrui Tian ,&nbsp;Zhiwei Chen ,&nbsp;Chunwang Dong\",\"doi\":\"10.1016/j.foodres.2024.114929\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Black tea is the second most common type of tea in China. Fermentation is one of the most critical processes in its production, and it affects the quality of the finished product, whether it is insufficient or excessive. At present, the determination of black tea fermentation degree completely relies on artificial experience. It leads to inconsistent quality of black tea. 
To solve this problem, we use machine vision technology to distinguish the degree of fermentation of black tea based on images, this paper proposes a lightweight convolutional neural network (CNN) combined with knowledge distillation to discriminate the degree of fermentation of black tea. After comparing 12 kinds of CNN models, taking into account the size of the model and the performance of discrimination, as well as the selection principle of teacher models, Shufflenet_v2_x1.0 is selected as the student model, and Efficientnet_v2 is selected as the teacher model. Then, CrossEntropy Loss is replaced by Focal Loss. Finally, for Distillation Loss ratios of 0.6, 0.7, 0.8, 0.9, Soft Target Knowledge Distillation (ST), Masked Generative Distillation (MGD), Similarity-Preserving Knowledge Distillation (SPKD), and Attention Transfer (AT) four knowledge distillation methods are tested for their performance in distilling knowledge from the Shufflenet_v2_x1.0 model. The results show that the model discrimination performance after distillation is the best when the Distillation Loss ratio is 0.8 and the MGD method is used. This setup effectively improves the discrimination performance without increasing the number of parameters and computation volume. The model’s P, R and F1 values reach 0.9208, 0.9190 and 0.9192, respectively. It achieves precise discrimination of the fermentation degree of black tea. This meets the requirements of objective black tea fermentation judgment and provides technical support for the intelligent processing of black tea.</p></div>\",\"PeriodicalId\":323,\"journal\":{\"name\":\"Food Research International\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":7.0000,\"publicationDate\":\"2024-08-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Food Research International\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0963996924009992\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"FOOD SCIENCE & TECHNOLOGY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Food Research International","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0963996924009992","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"FOOD SCIENCE & TECHNOLOGY","Score":null,"Total":0}
Cited by: 0

Abstract

Black tea is the second most common type of tea in China. Fermentation is one of the most critical processes in its production: both insufficient and excessive fermentation degrade the quality of the finished product. At present, the determination of black tea fermentation degree relies entirely on human experience, which leads to inconsistent quality. To solve this problem, this paper uses machine vision to discriminate the fermentation degree of black tea from images, proposing a lightweight convolutional neural network (CNN) combined with knowledge distillation. After comparing 12 CNN models, and taking into account model size, discrimination performance, and the selection principle for teacher models, Shufflenet_v2_x1.0 is selected as the student model and Efficientnet_v2 as the teacher model. CrossEntropy Loss is then replaced by Focal Loss. Finally, four knowledge distillation methods, Soft Target Knowledge Distillation (ST), Masked Generative Distillation (MGD), Similarity-Preserving Knowledge Distillation (SPKD), and Attention Transfer (AT), are tested at Distillation Loss ratios of 0.6, 0.7, 0.8, and 0.9 for their performance in distilling knowledge into the Shufflenet_v2_x1.0 student model. The results show that the distilled model discriminates best when the Distillation Loss ratio is 0.8 and the MGD method is used. This setup effectively improves discrimination performance without increasing the number of parameters or the computational cost. The model's precision (P), recall (R), and F1 values reach 0.9208, 0.9190, and 0.9192, respectively, achieving precise discrimination of the fermentation degree of black tea. This meets the requirements for objective judgment of black tea fermentation and provides technical support for the intelligent processing of black tea.
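The abstract describes a concrete training recipe: a Shufflenet_v2_x1.0 student distilled from an Efficientnet_v2 teacher, Focal Loss in place of CrossEntropy Loss, and a Distillation Loss ratio of 0.8. The paper's code is not reproduced here, so the following is a minimal PyTorch sketch of such an objective. The class count, the temperature T, the Efficientnet_v2 variant, and the use of the simpler soft-target (ST) formulation for the distillation term are illustrative assumptions; the paper's best-performing method was MGD, sketched after this block.

```python
# Hypothetical PyTorch sketch of the training objective described above:
# a Shufflenet_v2_x1.0 student distilled from an Efficientnet_v2 teacher,
# with Focal Loss replacing CrossEntropy Loss and a Distillation Loss
# ratio (ALPHA) of 0.8. NUM_CLASSES, T, the Efficientnet_v2 variant, and
# the soft-target (ST) distillation term are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F
from torchvision import models

NUM_CLASSES = 4  # assumed number of fermentation-degree classes

class FocalLoss(nn.Module):
    """Focal Loss: scales cross-entropy by (1 - p_t)^gamma to
    down-weight easy examples."""
    def __init__(self, gamma: float = 2.0):
        super().__init__()
        self.gamma = gamma

    def forward(self, logits, targets):
        log_p = F.log_softmax(logits, dim=1)
        ce = F.nll_loss(log_p, targets, reduction="none")  # per-sample CE
        p_t = log_p.gather(1, targets.unsqueeze(1)).squeeze(1).exp()
        return ((1.0 - p_t) ** self.gamma * ce).mean()

# Teacher and student as named in the abstract (torchvision builders;
# the small Efficientnet_v2 variant is an assumption).
teacher = models.efficientnet_v2_s(num_classes=NUM_CLASSES).eval()
student = models.shufflenet_v2_x1_0(num_classes=NUM_CLASSES)

focal = FocalLoss()
ALPHA = 0.8  # Distillation Loss ratio reported best in the abstract
T = 4.0      # softmax temperature, an assumed value

def train_step(images: torch.Tensor, labels: torch.Tensor) -> torch.Tensor:
    with torch.no_grad():
        t_logits = teacher(images)  # teacher predictions, frozen
    s_logits = student(images)
    # Soft-target term: KL divergence between softened distributions.
    kd = F.kl_div(
        F.log_softmax(s_logits / T, dim=1),
        F.softmax(t_logits / T, dim=1),
        reduction="batchmean",
    ) * (T * T)
    # Hard-label term uses Focal Loss instead of CrossEntropy.
    hard = focal(s_logits, labels)
    return ALPHA * kd + (1.0 - ALPHA) * hard
```

The ALPHA * kd + (1 - ALPHA) * hard form mirrors the Distillation Loss ratio the abstract sweeps over (0.6 to 0.9); at 0.8, most of the training signal comes from the teacher rather than the hard labels.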
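Since the best reported configuration pairs the 0.8 ratio with Masked Generative Distillation (MGD), a sketch of the MGD loss itself may also help. It follows the generally published MGD design: randomly mask the student's feature map, regenerate the teacher's feature map with a small convolutional generator, and penalize the reconstruction with MSE. The channel sizes, mask ratio, and the layer at which the loss attaches are illustrative assumptions, as the abstract does not specify them.

```python
# Hypothetical sketch of the Masked Generative Distillation (MGD) loss,
# following its published design: randomly mask the student feature map,
# regenerate the teacher feature map with a small generator, and compare
# with MSE. Channel sizes, the mask ratio, and the attachment point in
# the networks are illustrative assumptions.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MGDLoss(nn.Module):
    def __init__(self, s_channels: int, t_channels: int,
                 mask_ratio: float = 0.5):
        super().__init__()
        self.mask_ratio = mask_ratio
        # 1x1 conv aligns student channels to the teacher's channel count.
        self.align = nn.Conv2d(s_channels, t_channels, kernel_size=1)
        # Generation block: reconstructs teacher features from the
        # masked, aligned student features.
        self.generator = nn.Sequential(
            nn.Conv2d(t_channels, t_channels, kernel_size=3, padding=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(t_channels, t_channels, kernel_size=3, padding=1),
        )

    def forward(self, s_feat: torch.Tensor, t_feat: torch.Tensor):
        s_feat = self.align(s_feat)
        n, _, h, w = s_feat.shape
        # Keep each spatial position with probability (1 - mask_ratio).
        keep = (torch.rand(n, 1, h, w, device=s_feat.device)
                > self.mask_ratio).float()
        rec = self.generator(s_feat * keep)
        return F.mse_loss(rec, t_feat)
```

In training, a term like this would take the place of the logit-level kd term in the previous sketch, weighted by the same 0.8 ratio against the Focal Loss on hard labels.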

Source journal
Food Research International (Engineering & Technology: Food Science & Technology)
CiteScore: 12.50
Self-citation rate: 7.40%
Annual publications: 1183
Review time: 79 days
About the journal: Food Research International serves as a rapid dissemination platform for significant and impactful research in food science, technology, engineering, and nutrition. The journal focuses on publishing novel, high-quality, and high-impact review papers, original research papers, and letters to the editors across various disciplines in the science and technology of food. Additionally, it follows a policy of publishing special issues on topical and emergent subjects in food research or related areas. Selected, peer-reviewed papers from scientific meetings, workshops, and conferences on the science, technology, and engineering of foods are also featured in special issues.