Panqi Pu, Junge Wang, Geqi Yan, Hongchao Jiao, Hao Li, Hai Lin
{"title":"基于YOLOv8n的猪多场景行为识别","authors":"Panqi Pu, Junge Wang, Geqi Yan, Hongchao Jiao, Hao Li, Hai Lin","doi":"10.3390/ani15192927","DOIUrl":null,"url":null,"abstract":"<p><p>Advances in smart animal husbandry necessitate efficient pig behavior monitoring, yet traditional approaches suffer from operational inefficiency and animal stress. We address these limitations through a lightweight YOLOv8n architecture enhanced with SPD-Conv for feature preservation during downsampling, LSKBlock attention for contextual feature fusion, and a dedicated small-target detection head. Experimental validation demonstrates superior performance: the optimized model achieves a 92.4% mean average precision (mAP@0.5) and 87.4% recall, significantly outperforming baseline YOLOv8n by 3.7% in AP while maintaining minimal parameter growth (3.34M). Controlled illumination tests confirm enhanced robustness under strong and warm lighting conditions, with performance gains of 1.5% and 0.7% in AP, respectively. This high-precision framework enables real-time recognition of standing, prone lying, lateral lying, and feeding behaviors in commercial piggeries, supporting early health anomaly detection through non-invasive monitoring.</p>","PeriodicalId":7955,"journal":{"name":"Animals","volume":"15 19","pages":""},"PeriodicalIF":2.7000,"publicationDate":"2025-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12523645/pdf/","citationCount":"0","resultStr":"{\"title\":\"EnhancedMulti-Scenario Pig Behavior Recognition Based on YOLOv8n.\",\"authors\":\"Panqi Pu, Junge Wang, Geqi Yan, Hongchao Jiao, Hao Li, Hai Lin\",\"doi\":\"10.3390/ani15192927\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Advances in smart animal husbandry necessitate efficient pig behavior monitoring, yet traditional approaches suffer from operational inefficiency and animal stress. We address these limitations through a lightweight YOLOv8n architecture enhanced with SPD-Conv for feature preservation during downsampling, LSKBlock attention for contextual feature fusion, and a dedicated small-target detection head. Experimental validation demonstrates superior performance: the optimized model achieves a 92.4% mean average precision (mAP@0.5) and 87.4% recall, significantly outperforming baseline YOLOv8n by 3.7% in AP while maintaining minimal parameter growth (3.34M). Controlled illumination tests confirm enhanced robustness under strong and warm lighting conditions, with performance gains of 1.5% and 0.7% in AP, respectively. 
This high-precision framework enables real-time recognition of standing, prone lying, lateral lying, and feeding behaviors in commercial piggeries, supporting early health anomaly detection through non-invasive monitoring.</p>\",\"PeriodicalId\":7955,\"journal\":{\"name\":\"Animals\",\"volume\":\"15 19\",\"pages\":\"\"},\"PeriodicalIF\":2.7000,\"publicationDate\":\"2025-10-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12523645/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Animals\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://doi.org/10.3390/ani15192927\",\"RegionNum\":2,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, DAIRY & ANIMAL SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Animals","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.3390/ani15192927","RegionNum":2,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, DAIRY & ANIMAL SCIENCE","Score":null,"Total":0}
Enhanced Multi-Scenario Pig Behavior Recognition Based on YOLOv8n
Advances in smart animal husbandry necessitate efficient pig behavior monitoring, yet traditional approaches suffer from operational inefficiency and animal stress. We address these limitations through a lightweight YOLOv8n architecture enhanced with SPD-Conv for feature preservation during downsampling, LSKBlock attention for contextual feature fusion, and a dedicated small-target detection head. Experimental validation demonstrates superior performance: the optimized model achieves a 92.4% mean average precision (mAP@0.5) and 87.4% recall, significantly outperforming the baseline YOLOv8n by 3.7% in AP while maintaining minimal parameter growth (3.34M). Controlled illumination tests confirm enhanced robustness under strong and warm lighting conditions, with performance gains of 1.5% and 0.7% in AP, respectively. This high-precision framework enables real-time recognition of standing, prone lying, lateral lying, and feeding behaviors in commercial piggeries, supporting early health anomaly detection through non-invasive monitoring.
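The SPD-Conv component mentioned in the abstract (downsampling without discarding pixel information) can be illustrated with a small sketch. The block below is a minimal, generic PyTorch implementation of a space-to-depth convolution, written from the commonly published SPD-Conv formulation rather than from the authors' code; the class and argument names (SPDConv, c_in, c_out) are illustrative assumptions.

```
# Hedged sketch of a space-to-depth convolution block (SPD-Conv style).
# Not the authors' implementation; names and hyperparameters are assumptions.
import torch
import torch.nn as nn


class SPDConv(nn.Module):
    """Space-to-depth followed by a non-strided convolution.

    Instead of a stride-2 convolution (which skips pixels), each 2x2 spatial
    block is folded into the channel dimension, so resolution is halved
    without losing information, and a 3x3 conv then mixes the stacked channels.
    """

    def __init__(self, c_in: int, c_out: int):
        super().__init__()
        # After space-to-depth with scale 2, the channel count grows 4x.
        self.conv = nn.Conv2d(4 * c_in, c_out, kernel_size=3, padding=1, bias=False)
        self.bn = nn.BatchNorm2d(c_out)
        self.act = nn.SiLU()

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # Rearrange (B, C, H, W) -> (B, 4C, H/2, W/2) by interleaved slicing.
        x = torch.cat(
            [x[..., ::2, ::2], x[..., 1::2, ::2], x[..., ::2, 1::2], x[..., 1::2, 1::2]],
            dim=1,
        )
        return self.act(self.bn(self.conv(x)))


if __name__ == "__main__":
    layer = SPDConv(c_in=64, c_out=128)
    feats = torch.randn(1, 64, 80, 80)   # e.g. an 80x80 backbone feature map
    print(layer(feats).shape)            # torch.Size([1, 128, 40, 40])
```

In a YOLOv8n-style backbone, such a block would typically replace each stride-2 convolution; the abstract does not specify the exact insertion points, the LSKBlock placement, or the small-target detection-head configuration, so those details are omitted here.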
Animals (Agricultural and Biological Sciences: Animal Science and Zoology)
CiteScore: 4.90
Self-citation rate: 16.70%
Articles published: 3015
Review time: 20.52 days
Journal description:
Animals (ISSN 2076-2615) is an international and interdisciplinary scholarly open access journal. It publishes original research articles, reviews, communications, and short notes that are relevant to any field of study that involves animals, including zoology, ethnozoology, animal science, animal ethics and animal welfare. However, preference will be given to those articles that provide an understanding of animals within a larger context (i.e., the animals' interactions with the outside world, including humans). There is no restriction on the length of the papers. Our aim is to encourage scientists to publish their experimental and theoretical research in as much detail as possible. Full experimental details and/or the method of study must be provided for research articles. Articles submitted that involve subjecting animals to unnecessary pain or suffering will not be accepted, and all articles must be submitted with the necessary ethical approval (please refer to the Ethical Guidelines for more information).