Automatic detection of trapping events of postnatal piglets in loose housing pen: comparison of YOLO versions 4, 5, and 8
Authors: Taeyong Yun, Jinsul Kim, Jinhyeon Yun, Tai-Won Um
DOI: 10.5187/jast.2024.e106
Journal: Journal of Animal Science and Technology, 67(3), 666-676
Impact factor: 2.7 (JCR Q1, Agriculture, Dairy & Animal Science)
Published: 2025-05-01 (Epub 2025-05-31)
Open access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12159704/pdf/
Citations: 0
Abstract
Automatic detection of trapping events of postnatal piglets in loose housing pen: comparison of YOLO versions 4, 5, and 8.
In recent years, the pig industry has experienced an alarming surge in piglet mortality shortly after farrowing due to crushing by the sow. This issue has been exacerbated by the adoption of hyperprolific sows and the transition to loose housing pens, adversely affecting both animal welfare and productivity. In response to these challenges, researchers have progressively turned to the artificial intelligence of things (AIoT) to address various issues within the livestock sector. The primary objective of this study was to conduct a comparative analysis of different versions of object detection algorithms, aiming to identify the optimal AIoT system for monitoring piglet crushing events based on performance and practicality. The methodology involved extracting relevant footage depicting instances of piglet crushing from recorded farrowing pen videos, which were subsequently condensed into 2-3 min edited clips. These clips were categorized into three classes: no trapping, trapping, and crushing. Data augmentation techniques, including rotation, flipping, and adjustments to saturation and contrast, were applied to enhance the dataset. This study employed three deep learning object detection algorithms, You Only Look Once (YOLO)v4-Tiny, YOLOv5s, and YOLOv8s, followed by a performance analysis. The average precision (AP) for trapping detection yielded values of 0.963 for YOLOv4-Tiny and 0.995 for both YOLOv5s and YOLOv8s. Notably, trapping detection performance was similar between YOLOv5s and YOLOv8s. However, YOLOv5s proved to be the best choice considering its model size of 13.6 MB, compared with YOLOv4-Tiny's 22.4 MB and YOLOv8s's 21.4 MB. Considering both performance metrics and model size, YOLOv5s emerges as the most suitable model for detecting trapping within an AIoT framework.
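The augmentation steps listed above (rotation, flipping, and saturation/contrast adjustment) can be sketched roughly as follows. This is an illustrative NumPy sketch, not the authors' pipeline; the scaling factors (1.2, 1.3) are arbitrary assumptions for demonstration.

```python
import numpy as np

def augment(frame: np.ndarray) -> dict[str, np.ndarray]:
    """Return simple augmented variants of an H x W x 3 uint8 video frame,
    mirroring the categories the abstract lists (illustrative only)."""
    out = {
        "rot90": np.rot90(frame),       # 90-degree rotation
        "hflip": frame[:, ::-1],        # horizontal flip
    }
    # Contrast: scale pixel deviations from the global mean.
    mean = frame.mean()
    out["contrast"] = np.clip((frame - mean) * 1.2 + mean, 0, 255).astype(frame.dtype)
    # Saturation: scale each pixel's deviation from its grayscale value.
    gray = frame.mean(axis=2, keepdims=True)
    out["saturation"] = np.clip(gray + (frame - gray) * 1.3, 0, 255).astype(frame.dtype)
    return out
```

In practice such transforms are usually applied per training image (often via a library such as Albumentations or the augmentation hooks built into the YOLO training pipelines), with bounding boxes transformed alongside the pixels.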
Future endeavors may leverage this research to refine and expand the scope of AIoT applications in addressing challenges within the pig industry, ultimately contributing to advancements in both animal husbandry practices and technological solutions.
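The trade-off the abstract describes, detection AP versus on-disk model size for edge AIoT deployment, can be illustrated directly from the reported numbers. This is a hypothetical sketch, not code from the study; only the AP and size values come from the abstract.

```python
# Trapping-detection AP and model sizes as reported in the abstract.
models = {
    "YOLOv4-Tiny": {"ap": 0.963, "size_mb": 22.4},
    "YOLOv5s":     {"ap": 0.995, "size_mb": 13.6},
    "YOLOv8s":     {"ap": 0.995, "size_mb": 21.4},
}

def best_model(candidates: dict) -> str:
    """Highest AP wins; ties go to the smaller model (cheaper on edge devices)."""
    return max(candidates,
               key=lambda name: (candidates[name]["ap"], -candidates[name]["size_mb"]))

print(best_model(models))  # YOLOv5s, matching the paper's conclusion
```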
About the journal:
Journal of Animal Science and Technology (J. Anim. Sci. Technol. or JAST) is a peer-reviewed, open access journal publishing original research, review articles and notes in all fields of animal science.
Topics covered by the journal include: genetics and breeding, physiology, nutrition of monogastric animals, nutrition of ruminants, animal products (milk, meat, eggs and their by-products) and their processing, grasslands and roughages, livestock environment, animal biotechnology, animal behavior and welfare.
Articles generally report research involving beef cattle, dairy cattle, pigs, companion animals, goats, horses, and sheep. However, studies involving other farm animals, aquatic and wildlife species, and laboratory animal species that address fundamental questions related to livestock and companion animal biology will also be considered for publication.
The Journal of Animal Science and Technology (J. Anim. Sci. Technol. or JAST) has been the official journal of The Korean Society of Animal Science and Technology (KSAST) since 2000; it was formerly known as The Korean Journal of Animal Sciences (launched in 1956).