MIF-YOLO: An Enhanced YOLO with Multi-Source Image Fusion for Autonomous Dead Chicken Detection
Jiapan Li, Yan Zhang, Yong Zhang, Hongwei Shi, Xianfang Song, Chao Peng
Smart Agricultural Technology, Volume 12, Article 101104 (Q1, Agricultural Engineering; IF 6.3)
Published: 2025-06-16 (journal article)
DOI: 10.1016/j.atech.2025.101104
URL: https://www.sciencedirect.com/science/article/pii/S2772375525003375
Citations: 0
Abstract
Automated systems for detecting dead poultry in large-scale farms remain scarce, and manual inspection is onerous and time-consuming. This study therefore introduces an enhanced YOLO algorithm with multi-source image fusion (MIF-YOLO) for the autonomous identification of dead chickens. The approach first applies progressive illumination-aware fusion (PIAFusion) to combine thermal-infrared and visible-light imagery, accentuating the salient features of dead chickens and counteracting non-uniform illumination. To address feature extraction under heavy occlusion, the model incorporates the Rep-DCNv3 module, which strengthens the backbone network's capacity to discern subtle characteristics of dead chickens. In addition, an efficient multi-scale attention (EMA) mechanism is embedded in the neck of the YOLO architecture to improve target discrimination in low-light scenes, enhancing both accuracy and adaptability. The loss function is refined with the minimum point distance IoU (MPDIoU), enabling a more nuanced evaluation of bounding-box overlap. Validated on a dataset of caged white-feathered chickens collected from a farm in Suqian, Jiangsu Province, the model attains a precision of 99.2% and an mAP@0.5 of 98.9%, surpassing existing state-of-the-art methods. The proposed method delivers not only rapid detection but also a marked improvement in detection fidelity, meeting the demands of real-time monitoring in operational agricultural settings.
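The MPDIoU refinement mentioned in the abstract can be illustrated with a minimal sketch of the standard minimum-point-distance formulation: the IoU term is penalized by the squared distances between the predicted and ground-truth top-left and bottom-right corners, normalized by the squared image diagonal. This is a generic illustration of the published MPDIoU loss, not the paper's exact implementation; function and variable names are illustrative.

```python
def mpdiou_loss(pred, target, img_w, img_h):
    """MPDIoU loss for axis-aligned boxes in (x1, y1, x2, y2) format.

    loss = 1 - IoU + d1^2 / (w^2 + h^2) + d2^2 / (w^2 + h^2),
    where d1, d2 are the distances between corresponding top-left and
    bottom-right corners, and (w, h) is the image size.
    """
    # Intersection rectangle and its area (zero if boxes are disjoint)
    ix1, iy1 = max(pred[0], target[0]), max(pred[1], target[1])
    ix2, iy2 = min(pred[2], target[2]), min(pred[3], target[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)

    # Plain IoU with a small epsilon against division by zero
    area_p = (pred[2] - pred[0]) * (pred[3] - pred[1])
    area_t = (target[2] - target[0]) * (target[3] - target[1])
    iou = inter / (area_p + area_t - inter + 1e-9)

    # Squared corner distances, normalized by the squared image diagonal
    diag2 = img_w ** 2 + img_h ** 2
    d1 = ((pred[0] - target[0]) ** 2 + (pred[1] - target[1]) ** 2) / diag2
    d2 = ((pred[2] - target[2]) ** 2 + (pred[3] - target[3]) ** 2) / diag2

    return 1.0 - iou + d1 + d2
```

For a perfect prediction the loss vanishes, while for disjoint boxes it exceeds 1 by the normalized corner penalties, which keeps gradients informative even when IoU alone would be zero.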