Skeleton-based image feature extraction for automated behavioral analysis in human-animal relationship tests

IF 2.2 | CAS Tier 2 (Agricultural & Forestry Sciences) | JCR Q1: AGRICULTURE, DAIRY & ANIMAL SCIENCE
Maciej Oczak, Jean-Loup Rault, Suzanne Truong, Oceane Schmitt
{"title":"基于骨架的图像特征提取,用于人兽关系测试中的自动行为分析","authors":"Maciej Oczak ,&nbsp;Jean-Loup Rault ,&nbsp;Suzanne Truong ,&nbsp;Oceane Schmitt","doi":"10.1016/j.applanim.2024.106347","DOIUrl":null,"url":null,"abstract":"<div><p>Arena tests are used to address various research questions related to animal behavior and human-animal relationships; e.g. how animals perceive specific human beings or people in general. Recent advancements in computer vision, specifically in application of key point detection models, might offer a possibility to extract variables that are the most often recorded in these tests in an automated way. The objective of this study was to measure two variables in human-pig arena test with computer vision techniques, i.e. distance between the subjects and pig’s visual attention proxy towards pen areas including a human. Human-pig interaction tests were organized inside a test arena measuring 147 × 168 cm. Thirty female pigs took part in the arena tests from 8 to 11 weeks of age, for a total of 210 tests (7 tests per pig), each with a 10-min duration. In total, 35 hours of human-pig interaction tests were video-recorded. To automatically detect human and pig skeletons, 4 models were trained on 100 images of labeled data, i.e. two YOLOv8 models to detect human and pig locations and two VitPose models to detect their skeletons. Models were validated on 50 images. The best performing models were selected to extract human and pig skeletons on recorded videos. Human-pig distance was calculated as the shortest Euclidean distance between all key points of the human and the pig. Visual attention proxy towards selected areas of the arena were calculated by extracting the pig’s head direction and calculating the intersection of a line indicating the heads direction and lines specifying the areas i.e. either lines of the quadrangles for the entrance and the window or lines joining the key points of the human skeleton. The performance of the YOLOv8 for detection of the human and the pig was 0.86 mAP and 0.85 mAP, respectively, and for the VitPose model 0.65 mAP and 0.78 mAP, respectively. The average distance between the human and the pig was 31.03 cm (SD = 35.99). Out of the three predefined areas in the arena, pigs spend most of their time with their head directed toward the human, i.e. 12 hrs 11 min (34.83 % of test duration). The developed method could be applied in human-animal relationship tests to automatically measure the distance between a human and a pig or another animal, visual attention proxy or other variables of interest.</p></div>","PeriodicalId":8222,"journal":{"name":"Applied Animal Behaviour Science","volume":"277 ","pages":"Article 106347"},"PeriodicalIF":2.2000,"publicationDate":"2024-07-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0168159124001953/pdfft?md5=87ecbe2be3a80a8aae57b193fa0a1d0b&pid=1-s2.0-S0168159124001953-main.pdf","citationCount":"0","resultStr":"{\"title\":\"Skeleton-based image feature extraction for automated behavioral analysis in human-animal relationship tests\",\"authors\":\"Maciej Oczak ,&nbsp;Jean-Loup Rault ,&nbsp;Suzanne Truong ,&nbsp;Oceane Schmitt\",\"doi\":\"10.1016/j.applanim.2024.106347\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><p>Arena tests are used to address various research questions related to animal behavior and human-animal relationships; e.g. how animals perceive specific human beings or people in general. 
Recent advancements in computer vision, specifically in application of key point detection models, might offer a possibility to extract variables that are the most often recorded in these tests in an automated way. The objective of this study was to measure two variables in human-pig arena test with computer vision techniques, i.e. distance between the subjects and pig’s visual attention proxy towards pen areas including a human. Human-pig interaction tests were organized inside a test arena measuring 147 × 168 cm. Thirty female pigs took part in the arena tests from 8 to 11 weeks of age, for a total of 210 tests (7 tests per pig), each with a 10-min duration. In total, 35 hours of human-pig interaction tests were video-recorded. To automatically detect human and pig skeletons, 4 models were trained on 100 images of labeled data, i.e. two YOLOv8 models to detect human and pig locations and two VitPose models to detect their skeletons. Models were validated on 50 images. The best performing models were selected to extract human and pig skeletons on recorded videos. Human-pig distance was calculated as the shortest Euclidean distance between all key points of the human and the pig. Visual attention proxy towards selected areas of the arena were calculated by extracting the pig’s head direction and calculating the intersection of a line indicating the heads direction and lines specifying the areas i.e. either lines of the quadrangles for the entrance and the window or lines joining the key points of the human skeleton. The performance of the YOLOv8 for detection of the human and the pig was 0.86 mAP and 0.85 mAP, respectively, and for the VitPose model 0.65 mAP and 0.78 mAP, respectively. The average distance between the human and the pig was 31.03 cm (SD = 35.99). Out of the three predefined areas in the arena, pigs spend most of their time with their head directed toward the human, i.e. 12 hrs 11 min (34.83 % of test duration). 
The developed method could be applied in human-animal relationship tests to automatically measure the distance between a human and a pig or another animal, visual attention proxy or other variables of interest.</p></div>\",\"PeriodicalId\":8222,\"journal\":{\"name\":\"Applied Animal Behaviour Science\",\"volume\":\"277 \",\"pages\":\"Article 106347\"},\"PeriodicalIF\":2.2000,\"publicationDate\":\"2024-07-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.sciencedirect.com/science/article/pii/S0168159124001953/pdfft?md5=87ecbe2be3a80a8aae57b193fa0a1d0b&pid=1-s2.0-S0168159124001953-main.pdf\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Applied Animal Behaviour Science\",\"FirstCategoryId\":\"97\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S0168159124001953\",\"RegionNum\":2,\"RegionCategory\":\"农林科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"AGRICULTURE, DAIRY & ANIMAL SCIENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Applied Animal Behaviour Science","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0168159124001953","RegionNum":2,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, DAIRY & ANIMAL SCIENCE","Score":null,"Total":0}
Citations: 0

Abstract


Arena tests are used to address various research questions related to animal behavior and human-animal relationships, e.g. how animals perceive specific human beings or people in general. Recent advancements in computer vision, specifically in the application of key point detection models, may offer a way to extract the variables most often recorded in these tests in an automated manner. The objective of this study was to measure two variables in a human-pig arena test with computer vision techniques: the distance between the subjects, and the pig's visual attention proxy towards pen areas, including a human. Human-pig interaction tests were organized inside a test arena measuring 147 × 168 cm. Thirty female pigs took part in the arena tests from 8 to 11 weeks of age, for a total of 210 tests (7 tests per pig), each 10 min in duration. In total, 35 hours of human-pig interaction tests were video-recorded. To automatically detect human and pig skeletons, 4 models were trained on 100 images of labeled data: two YOLOv8 models to detect human and pig locations, and two ViTPose models to detect their skeletons. Models were validated on 50 images. The best-performing models were selected to extract human and pig skeletons from the recorded videos. Human-pig distance was calculated as the shortest Euclidean distance between all key points of the human and the pig. The visual attention proxy towards selected areas of the arena was calculated by extracting the pig's head direction and computing the intersection of a line indicating the head's direction with the lines specifying the areas, i.e. either the lines of the quadrangles for the entrance and the window, or the lines joining the key points of the human skeleton. The performance of the YOLOv8 models for detection of the human and the pig was 0.86 mAP and 0.85 mAP, respectively, and of the ViTPose models 0.65 mAP and 0.78 mAP, respectively. The average distance between the human and the pig was 31.03 cm (SD = 35.99). Out of the three predefined areas in the arena, pigs spent most of their time with their head directed toward the human, i.e. 12 hrs 11 min (34.83% of test duration). The developed method could be applied in human-animal relationship tests to automatically measure the distance between a human and a pig or another animal, the visual attention proxy, or other variables of interest.
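
As a rough illustration of the detection stage, the sketch below trains and runs a YOLOv8 detector with the ultralytics package. The dataset config name ("pig_human.yaml"), the checkpoint choice, and the training hyperparameters are hypothetical; the abstract states only that the models were trained on 100 labeled images and validated on 50.

```python
from ultralytics import YOLO

# Minimal sketch of the detection stage, assuming the standard ultralytics
# API. The dataset config, checkpoint and hyperparameters are placeholders,
# not the authors' actual training setup.
model = YOLO("yolov8n.pt")                       # start from a pretrained checkpoint
model.train(data="pig_human.yaml", epochs=100, imgsz=640)
metrics = model.val()                            # detection mAP on the validation split
results = model("arena_frame.jpg")               # detect subjects in one video frame
for box in results[0].boxes:
    print(box.cls, box.conf, box.xyxy)           # class id, confidence, bounding box
```

The skeleton stage (ViTPose) would follow the same pattern: crop each detected subject with its bounding box and run a pose model on the crop to obtain key-point coordinates.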
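The distance measure itself is straightforward to reproduce: the shortest Euclidean distance over all human-pig key-point pairs. A minimal NumPy sketch, assuming both key-point arrays are in the same coordinate frame (pixels, or cm after camera calibration):

```python
import numpy as np

def min_keypoint_distance(human_kpts: np.ndarray, pig_kpts: np.ndarray) -> float:
    """Shortest Euclidean distance between any human and any pig key point.

    human_kpts: (N, 2) array of human skeleton key points.
    pig_kpts:   (M, 2) array of pig skeleton key points.
    """
    # Broadcasting: (N, 1, 2) - (1, M, 2) -> (N, M, 2) pairwise differences
    diff = human_kpts[:, None, :] - pig_kpts[None, :, :]
    dists = np.linalg.norm(diff, axis=-1)        # (N, M) distance matrix
    return float(dists.min())

# Illustrative call with random coordinates (17 human, 20 pig key points):
human = np.random.rand(17, 2) * 640
pig = np.random.rand(20, 2) * 640
print(min_keypoint_distance(human, pig))
```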
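The visual attention proxy reduces to a 2D ray-segment intersection test: a ray along the pig's head direction is checked against the segments bounding each area (the sides of the entrance and window quadrangles, or the segments joining the human skeleton's key points). A sketch under the assumption that the head direction is available as an origin and direction vector, e.g. derived from two head key points:

```python
import numpy as np

def ray_intersects_segment(origin, direction, p1, p2) -> bool:
    """Does the ray from `origin` along `direction` cross segment p1-p2?"""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
    seg = p2 - p1
    denom = direction[0] * seg[1] - direction[1] * seg[0]        # 2D cross product
    if abs(denom) < 1e-12:                                       # parallel: no crossing
        return False
    rel = p1 - origin
    t = (rel[0] * seg[1] - rel[1] * seg[0]) / denom              # position along the ray
    u = (rel[0] * direction[1] - rel[1] * direction[0]) / denom  # position on the segment
    return t >= 0.0 and 0.0 <= u <= 1.0

# Hypothetical frame: head ray from neck to snout key points, tested against
# one side of the window quadrangle (coordinates invented for illustration).
neck, snout = np.array([120.0, 200.0]), np.array([150.0, 190.0])
print(ray_intersects_segment(snout, snout - neck, (300, 100), (300, 260)))  # True
```

Classifying each frame by the area the ray hits, and summing frame durations per area, would reproduce the per-area attention times reported above (e.g. 12 hrs 11 min toward the human).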

Source journal
Applied Animal Behaviour Science (Agricultural & Forestry Sciences: Behavioural Sciences)
CiteScore: 4.40
Self-citation rate: 21.70%
Articles per year: 191
Review time: 18.1 weeks

Journal description: This journal publishes relevant information on the behaviour of domesticated and utilized animals. Topics covered include:
-Behaviour of farm, zoo and laboratory animals in relation to animal management and welfare
-Behaviour of companion animals in relation to behavioural problems, for example in relation to the training of dogs for different purposes
-Studies of the behaviour of wild animals when these studies are relevant from an applied perspective, for example in relation to wildlife management, pest management or nature conservation
-Methodological studies within relevant fields
The principal subjects are farm, companion and laboratory animals, including, of course, poultry. The journal also deals with the following animal subjects:
-Those involved in any farming system, e.g. deer, rabbits and fur-bearing animals
-Those in ANY form of confinement, e.g. zoos, safari parks and other forms of display
-Feral animals, and any animal species which impinge on farming operations, e.g. as causes of loss or damage
-Species used for hunting, recreation etc. may also be considered as acceptable subjects in some instances
-Laboratory animals, if the material relates to their behavioural requirements