Determining the posture and location of pigs using an object detection model under different lighting conditions.

IF 1.3 Q3 AGRICULTURE, DAIRY & ANIMAL SCIENCE
Translational Animal Science · Pub Date: 2024-12-03 · eCollection Date: 2024-01-01 · DOI: 10.1093/tas/txae167
Alice J Scaillierez, Tomás Izquierdo García-Faria, Harry Broers, Sofie E van Nieuwamerongen-de Koning, Rik P P J van der Tol, Eddie A M Bokkers, Iris J M M Boumans
Citations: 0

Abstract

Computer vision techniques are becoming increasingly popular for monitoring pig behavior. For instance, object detection models allow us to detect the presence of pigs, their location, and their posture. The performance of object detection models can be affected by variations in lighting conditions (e.g., intensity, spectrum, and uniformity). Furthermore, lighting conditions can influence pigs' active and resting behavior. In the context of experiments testing different lighting conditions, a detection model was developed to detect the location and postures of group-housed growing-finishing pigs. The objective of this paper is to validate the model, developed using YOLOv8, in detecting standing, sitting, sternal lying, and lateral lying pigs. The training, validation, and test datasets included annotations of pigs from 10 to 24 wk of age in 10 different light settings, varying in intensity, spectrum, and uniformity. Pig detection was comparable across the different lighting conditions, despite a slightly lower posture agreement under warm light and uneven light distribution, likely due to a less clear contrast between pigs and their background and the presence of shadows. The detection reached a mean average precision (mAP) of 89.4%. Standing was the best-detected posture, with the highest precision, sensitivity, and F1 score, while the sensitivity and F1 score of sitting were the lowest. This lower performance resulted from confusion of sitting with sternal lying and standing, a consequence of the top camera view and the low occurrence of sitting pigs in the annotated dataset. This issue is inherent to pig behavior and could be tackled using data augmentation. Some confusion was reported between the two types of lying due to occlusion by pen mates or by pigs' own bodies, and grouping both lying postures into one class improved detection (mAP = 97.0%). Therefore, comparing resting postures (both lying types) to active postures could lead to a more reliable interpretation of pigs' behavior. Some detection errors were observed, e.g., duplicate detections of the same pig caused by posture uncertainty, dirt on cameras detected as a pig, and pigs missed due to occlusion. The localization accuracy, measured by the intersection over union, was higher than 95.5% for 75% of the dataset, meaning that the locations of predicted pigs were very close to those of the annotated pigs. Tracking individual pigs revealed challenges with ID changes and switches between pen mates, requiring further work.
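For readers who want to experiment with a similar pipeline, the sketch below shows how a trained YOLOv8 detector can be queried with the ultralytics Python package and how intersection over union (the localization metric reported above) is computed. This is a minimal illustration, not the authors' released code: the weights file name pig_posture_yolov8.pt and the class ordering are assumptions made for the example.

```python
# Minimal sketch of posture detection with a YOLOv8 model via the ultralytics
# package, plus the intersection-over-union (IoU) localization metric.
# The weights file name and class order below are illustrative assumptions,
# not artifacts released with the paper.
from ultralytics import YOLO

# Assumed class order for the four postures annotated in the study.
POSTURES = ["standing", "sitting", "sternal_lying", "lateral_lying"]


def iou(box_a, box_b):
    """Intersection over union of two (x1, y1, x2, y2) boxes."""
    x1, y1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    x2, y2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, x2 - x1) * max(0.0, y2 - y1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def detect_postures(image_path, weights="pig_posture_yolov8.pt"):
    """Run a trained detector on one top-view pen image and return
    (posture, confidence, [x1, y1, x2, y2]) tuples for every detected pig."""
    model = YOLO(weights)
    result = model(image_path)[0]  # one Results object per input image
    detections = []
    for box in result.boxes:
        cls_id = int(box.cls.item())
        name = POSTURES[cls_id] if cls_id < len(POSTURES) else str(cls_id)
        detections.append((name, float(box.conf.item()),
                           [float(v) for v in box.xyxy[0].tolist()]))
    return detections
```

Grouping the two lying classes into a single "lying" label, as the authors did for the mAP = 97.0% result, would only require mapping sternal_lying and lateral_lying to the same name after detection.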

Source journal
Translational Animal Science (Veterinary, all)
CiteScore: 2.80
Self-citation rate: 15.40%
Articles published: 149
Review time: 8 weeks
Journal description: Translational Animal Science (TAS) is the first open-access, open-review animal science journal, encompassing a broad scope of research topics in animal science. TAS focuses on translating basic science to innovation, and on validation of these innovations by various segments of the allied animal industry. Readers of TAS typically represent education, industry, and government, including research, teaching, administration, extension, management, quality assurance, product development, and technical services. Those interested in TAS typically include animal breeders, economists, embryologists, engineers, food scientists, geneticists, microbiologists, nutritionists, veterinarians, physiologists, processors, public health professionals, and others with an interest in animal production and applied aspects of animal sciences.