MIF-YOLO: An Enhanced YOLO with Multi-Source Image Fusion for Autonomous Dead Chicken Detection

IF 6.3 | Q1 | Agricultural Engineering
Jiapan Li, Yan Zhang, Yong Zhang, Hongwei Shi, Xianfang Song, Chao Peng
Smart Agricultural Technology, Volume 12, Article 101104
DOI: 10.1016/j.atech.2025.101104
Published: 2025-06-16
Citations: 0

Abstract

Addressing the paucity of automated systems for detecting dead poultry in large-scale agricultural settings, where manual inspection is onerous and time-consuming, this study introduces an enhanced YOLO algorithm with multi-source image fusion (MIF-YOLO) for the autonomous identification of dead chickens. The proposed approach first applies progressive illumination-aware fusion (PIA Fusion) to amalgamate thermal infrared and visible-light imagery, thereby accentuating the salient features indicative of dead chickens and counteracting the impact of non-uniform illumination. To address the challenge of feature extraction under significant occlusion, the model incorporates the Rep-DCNv3 module, which augments the backbone network's capacity to discern subtle characteristics of dead chickens. Additionally, an exponential moving average (EMA) attention mechanism is embedded in the neck of the YOLO architecture to bolster target discrimination under low-light conditions, improving both accuracy and adaptability. The loss function is refined with Modified Partial Distance-IoU (MPDIoU), enabling a more nuanced evaluation of object overlap. Validated on a dataset of caged white-feathered chickens collected from a farm in Suqian, Jiangsu Province, the model attains a precision of 99.2% and an mAP@0.5 of 98.9%, surpassing existing state-of-the-art methods. This detection methodology ensures not only rapid detection but also a marked improvement in detection fidelity, meeting the demands of real-time monitoring in operational agricultural contexts.
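The MPDIoU loss mentioned in the abstract can be illustrated with a short sketch. The formulation below follows the minimum-point-distance variant of IoU commonly described in the detection literature: the standard IoU is penalized by the squared distances between the top-left and bottom-right corners of the predicted and ground-truth boxes, normalized by the squared image diagonal. The function name, corner-format box convention, and normalization term are illustrative assumptions, not details taken from the paper.

```python
def mpdiou_loss(pred, gt, img_w, img_h):
    """Sketch of an MPDIoU-style loss for one box pair.

    pred, gt: boxes as (x1, y1, x2, y2) corner coordinates.
    img_w, img_h: input image size, used to normalize corner distances.
    Returns 1 - MPDIoU, so identical boxes give a loss of 0.
    """
    px1, py1, px2, py2 = pred
    gx1, gy1, gx2, gy2 = gt

    # Intersection-over-union of the two boxes.
    ix1, iy1 = max(px1, gx1), max(py1, gy1)
    ix2, iy2 = min(px2, gx2), min(py2, gy2)
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    union = ((px2 - px1) * (py2 - py1)
             + (gx2 - gx1) * (gy2 - gy1) - inter)
    iou = inter / union if union > 0 else 0.0

    # Squared distances between matching corners (top-left, bottom-right).
    d1 = (px1 - gx1) ** 2 + (py1 - gy1) ** 2
    d2 = (px2 - gx2) ** 2 + (py2 - gy2) ** 2

    # Normalize by the squared image diagonal so the penalty is scale-aware.
    norm = img_w ** 2 + img_h ** 2
    mpdiou = iou - d1 / norm - d2 / norm
    return 1.0 - mpdiou
```

Because the corner-distance terms vanish only when both corners coincide, this loss keeps a useful gradient even for non-overlapping boxes, where plain IoU loss saturates at 1.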