Jie Zhou, Luo Liu, Tao Jiang, Haonan Tian, Mingxia Shen, Longshen Liu
{"title":"基于检测机器人的哺乳期母猪和仔猪行为检测新方法","authors":"Jie Zhou , Luo Liu , Tao Jiang , Haonan Tian , Mingxia Shen , Longshen Liu","doi":"10.1016/j.compag.2024.109613","DOIUrl":null,"url":null,"abstract":"<div><div>Accurately identifying behaviors exhibited by lactating sows and piglets is crucial for maintaining swine health and preventing farming crises. In the absence of dedicated swine behavior monitoring systems and the challenges of implementing cloud-based automated monitoring in large-scale farming, this study proposes a method utilizing inspection robots to detect behaviors of lactating sows and piglets. The inspection robot initially serves as a data acquisition and storage tool, collecting behavioral data such as sows postures (standing, sitting, lateral recumbency, and sternal recumbency) and activities of piglet groups (resting, suckling, and active behavior) within confined pens. The YOLOv8 series algorithms are then employed to identify static postures of sows, while the Temporal Shift Module (TSM) is used to recognize dynamic behaviors within piglet groups. These models are fine-tuned and deployed on the Jetson Nano edge computing platform. Experimental results show that YOLOv8n accurately identifies sow postures with a mean Average Precision (mAP) @0.5 of 97.08% and a frame rate of 36.4 FPS at an image resolution of 480 × 288, following TensorRT acceleration. For piglet behavior recognition, the TSM model, using ResNet50 as the backbone network, achieves a Top-1 accuracy of 93.63% in recognizing piglet behaviors. Replacing ResNet50 with MobileNetv2 slightly reduces the Top-1 accuracy to 90.81%; however, there is a significant improvement in inference speed on Jetson Nano for a single video clip with a processing duration of 542.51 ms, representing more than a 20-fold enhancement compared to TSM_ResNet50. The Kappa consistency analysis reveals moderate behavioral coherence among sows in different pens and piglet groups. The study offers insights into automated detection of behaviors lactating sows and piglets within large-scale intensive farming systems.</div></div>","PeriodicalId":50627,"journal":{"name":"Computers and Electronics in Agriculture","volume":"227 ","pages":"Article 109613"},"PeriodicalIF":7.7000,"publicationDate":"2024-11-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A Novel Behavior Detection Method for Sows and Piglets during Lactation Based on an Inspection Robot\",\"authors\":\"Jie Zhou , Luo Liu , Tao Jiang , Haonan Tian , Mingxia Shen , Longshen Liu\",\"doi\":\"10.1016/j.compag.2024.109613\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Accurately identifying behaviors exhibited by lactating sows and piglets is crucial for maintaining swine health and preventing farming crises. In the absence of dedicated swine behavior monitoring systems and the challenges of implementing cloud-based automated monitoring in large-scale farming, this study proposes a method utilizing inspection robots to detect behaviors of lactating sows and piglets. The inspection robot initially serves as a data acquisition and storage tool, collecting behavioral data such as sows postures (standing, sitting, lateral recumbency, and sternal recumbency) and activities of piglet groups (resting, suckling, and active behavior) within confined pens. 
The YOLOv8 series algorithms are then employed to identify static postures of sows, while the Temporal Shift Module (TSM) is used to recognize dynamic behaviors within piglet groups. These models are fine-tuned and deployed on the Jetson Nano edge computing platform. Experimental results show that YOLOv8n accurately identifies sow postures with a mean Average Precision (mAP) @0.5 of 97.08% and a frame rate of 36.4 FPS at an image resolution of 480 × 288, following TensorRT acceleration. For piglet behavior recognition, the TSM model, using ResNet50 as the backbone network, achieves a Top-1 accuracy of 93.63% in recognizing piglet behaviors. Replacing ResNet50 with MobileNetv2 slightly reduces the Top-1 accuracy to 90.81%; however, there is a significant improvement in inference speed on Jetson Nano for a single video clip with a processing duration of 542.51 ms, representing more than a 20-fold enhancement compared to TSM_ResNet50. The Kappa consistency analysis reveals moderate behavioral coherence among sows in different pens and piglet groups. The study offers insights into automated detection of behaviors lactating sows and piglets within large-scale intensive farming systems.</div></div>\",\"PeriodicalId\":50627,\"journal\":{\"name\":\"Computers and Electronics in Agriculture\",\"volume\":\"227 \",\"pages\":\"Article 109613\"},\"PeriodicalIF\":7.7000,\"publicationDate\":\"2024-11-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computers and Electronics in Agriculture\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168169924010044\",\"RegionNum\":1,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computers and Electronics in Agriculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168169924010044","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, MULTIDISCIPLINARY","Score":null,"Total":0}
A Novel Behavior Detection Method for Sows and Piglets during Lactation Based on an Inspection Robot
Accurately identifying behaviors exhibited by lactating sows and piglets is crucial for maintaining swine health and preventing farming crises. Given the absence of dedicated swine behavior monitoring systems and the challenges of implementing cloud-based automated monitoring in large-scale farming, this study proposes a method that uses inspection robots to detect behaviors of lactating sows and piglets. The inspection robot initially serves as a data acquisition and storage tool, collecting behavioral data such as sow postures (standing, sitting, lateral recumbency, and sternal recumbency) and activities of piglet groups (resting, suckling, and active behavior) within confined pens. The YOLOv8 series algorithms are then employed to identify static postures of sows, while the Temporal Shift Module (TSM) is used to recognize dynamic behaviors within piglet groups. These models are fine-tuned and deployed on the Jetson Nano edge computing platform. Experimental results show that, after TensorRT acceleration, YOLOv8n accurately identifies sow postures with a mean Average Precision (mAP@0.5) of 97.08% and a frame rate of 36.4 FPS at an image resolution of 480 × 288. For piglet behavior recognition, the TSM model with ResNet50 as the backbone network achieves a Top-1 accuracy of 93.63%. Replacing ResNet50 with MobileNetV2 slightly reduces the Top-1 accuracy to 90.81%; however, inference speed on the Jetson Nano improves markedly, with a single video clip processed in 542.51 ms, a more than 20-fold speedup over TSM_ResNet50. The Kappa consistency analysis reveals moderate behavioral coherence among sows in different pens and piglet groups. The study offers insights into the automated detection of behaviors of lactating sows and piglets within large-scale intensive farming systems.
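As a rough illustration of the sow-posture detection step, the sketch below runs a YOLOv8n model over a single pen image with the Ultralytics API. It is a minimal sketch and not the authors' released code: the checkpoint name, image path, confidence threshold, and posture class ordering are assumptions; only the four posture labels and the 480 × 288 input size are taken from the abstract.

```python
from ultralytics import YOLO

# Posture classes as listed in the abstract; their index order here is assumed.
POSTURES = ["standing", "sitting", "lateral recumbency", "sternal recumbency"]

model = YOLO("sow_posture_yolov8n.pt")  # hypothetical fine-tuned YOLOv8n checkpoint
results = model.predict("pen_frame.jpg", imgsz=[288, 480], conf=0.5)  # 480x288 input as reported

for box in results[0].boxes:
    print(POSTURES[int(box.cls)], f"{float(box.conf):.2f}")
```

For the piglet-group branch, the Temporal Shift Module adds temporal reasoning to a 2D backbone (ResNet50 or MobileNetV2 here) by shifting a fraction of feature channels between neighboring frames. The function below is a generic PyTorch rendering of that shift operation as described in the original TSM paper; the segment count and shift fraction are assumptions, and the authors' exact integration into the backbone is not specified in the abstract.

```python
import torch

def temporal_shift(x: torch.Tensor, n_segments: int = 8, shift_div: int = 8) -> torch.Tensor:
    """Shift 1/shift_div of the channels one step back in time, another
    1/shift_div one step forward, and leave the rest untouched.
    x has shape (N*T, C, H, W) with T = n_segments frames per clip."""
    nt, c, h, w = x.size()
    n = nt // n_segments
    x = x.view(n, n_segments, c, h, w)
    fold = c // shift_div

    out = torch.zeros_like(x)
    out[:, :-1, :fold] = x[:, 1:, :fold]                  # shift backward in time
    out[:, 1:, fold:2 * fold] = x[:, :-1, fold:2 * fold]  # shift forward in time
    out[:, :, 2 * fold:] = x[:, :, 2 * fold:]             # unshifted channels
    return out.view(nt, c, h, w)
```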
Journal Introduction:
Computers and Electronics in Agriculture provides international coverage of advancements in computer hardware, software, electronic instrumentation, and control systems applied to agricultural challenges. Encompassing agronomy, horticulture, forestry, aquaculture, and animal farming, the journal publishes original papers, reviews, and application notes. It explores the use of computers and electronics in plant or animal agricultural production, covering topics such as agricultural soils, water, pests, controlled environments, and waste. The scope extends to on-farm post-harvest operations and relevant technologies, including artificial intelligence, sensors, machine vision, robotics, networking, and simulation modeling. Its companion journal, Smart Agricultural Technology, continues the focus on smart applications in production agriculture.