{"title":"Estimating vegetation index for outdoor free-range pig production using YOLO.","authors":"Sang-Hyon Oh, Hee-Mun Park, Jin-Hyun Park","doi":"10.5187/jast.2023.e41","DOIUrl":null,"url":null,"abstract":"<p><p>The objective of this study was to quantitatively estimate the level of grazing area damage in outdoor free-range pig production using a Unmanned Aerial Vehicles (UAV) with an RGB image sensor. Ten corn field images were captured by a UAV over approximately two weeks, during which gestating sows were allowed to graze freely on the corn field measuring 100 × 50 m<sup>2</sup>. The images were corrected to a bird's-eye view, and then divided into 32 segments and sequentially inputted into the YOLOv4 detector to detect the corn images according to their condition. The 43 raw training images selected randomly out of 320 segmented images were flipped to create 86 images, and then these images were further augmented by rotating them in 5-degree increments to create a total of 6,192 images. The increased 6,192 images are further augmented by applying three random color transformations to each image, resulting in 24,768 datasets. The occupancy rate of corn in the field was estimated efficiently using You Only Look Once (YOLO). As of the first day of observation (day 2), it was evident that almost all the corn had disappeared by the ninth day. When grazing 20 sows in a 50 × 100 m<sup>2</sup> cornfield (250 m<sup>2</sup>/sow), it appears that the animals should be rotated to other grazing areas to protect the cover crop after at least five days. In agricultural technology, most of the research using machine and deep learning is related to the detection of fruits and pests, and research on other application fields is needed. In addition, large-scale image data collected by experts in the field are required as training data to apply deep learning. 
If the data required for deep learning is insufficient, a large number of data augmentation is required.</p>","PeriodicalId":14923,"journal":{"name":"Journal of Animal Science and Technology","volume":"65 3","pages":"638-651"},"PeriodicalIF":2.7000,"publicationDate":"2023-05-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ftp.ncbi.nlm.nih.gov/pub/pmc/oa_pdf/10/63/jast-65-3-638.PMC10271927.pdf","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Animal Science and Technology","FirstCategoryId":"97","ListUrlMain":"https://doi.org/10.5187/jast.2023.e41","RegionNum":3,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"AGRICULTURE, DAIRY & ANIMAL SCIENCE","Score":null,"Total":0}
Citations: 1
Abstract
The objective of this study was to quantitatively estimate the level of grazing-area damage in outdoor free-range pig production using an unmanned aerial vehicle (UAV) with an RGB image sensor. Ten corn field images were captured by a UAV over approximately two weeks, during which gestating sows were allowed to graze freely on a 100 × 50 m² corn field. The images were corrected to a bird's-eye view, divided into 32 segments, and sequentially input into the YOLOv4 detector to detect corn according to its condition. Forty-three raw training images, selected randomly from the 320 segmented images, were flipped to create 86 images, which were then rotated in 5-degree increments to create a total of 6,192 images. These 6,192 images were further augmented by applying three random color transformations to each image, yielding 24,768 training images. The occupancy rate of corn in the field was estimated efficiently using You Only Look Once (YOLO). Counting from the first day of observation (day 2), almost all the corn had disappeared by the ninth day. When grazing 20 sows in a 50 × 100 m² cornfield (250 m²/sow), it appears that the animals should be rotated to other grazing areas after at least five days to protect the cover crop. In agricultural technology, most research using machine and deep learning concerns the detection of fruits and pests, so research on other application fields is needed. In addition, applying deep learning requires large-scale image data collected by experts in the field as training data. When the available data are insufficient, extensive data augmentation is required.
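The augmentation arithmetic in the abstract (flip doubling, 5-degree rotations, three color transforms) can be sketched as below. This is a minimal illustration with assumed function names and dummy images, not the authors' implementation; in particular, the 5-degree rotations are represented only by their count, since the paper does not specify the interpolation used.

```python
# Hypothetical sketch of the augmentation pipeline described in the abstract.
# All names (flip_augment, rotation_count, color_augment) are assumptions.
import numpy as np

def flip_augment(images):
    """Double the set by adding a horizontal mirror of each image."""
    return images + [np.fliplr(img) for img in images]

def rotation_count(step_deg=5):
    """Number of orientations when rotating in step_deg increments (0..355)."""
    return 360 // step_deg  # 72 orientations per image

def color_augment(img, rng):
    """One random color transform: scale each RGB channel independently."""
    scale = rng.uniform(0.8, 1.2, size=(1, 1, 3))
    return np.clip(img.astype(float) * scale, 0, 255).astype(np.uint8)

rng = np.random.default_rng(0)
# 43 randomly selected raw training segments, stand-in 64x64 RGB arrays.
raw = [rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8) for _ in range(43)]

flipped = flip_augment(raw)                  # 43 -> 86 images
n_rotated = len(flipped) * rotation_count()  # 86 * 72 = 6,192 images
n_final = n_rotated * 4                      # each image + 3 color variants = 24,768

print(len(flipped), n_rotated, n_final)  # -> 86 6192 24768
```

In practice the rotation step would use an image library (e.g. Pillow's `Image.rotate`); the counts above simply confirm that 43 × 2 × 72 × 4 reproduces the 24,768 figure reported in the abstract.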
Journal introduction:
Journal of Animal Science and Technology (J. Anim. Sci. Technol. or JAST) is a peer-reviewed, open access journal publishing original research, review articles and notes in all fields of animal science.
Topics covered by the journal include: genetics and breeding, physiology, nutrition of monogastric animals, nutrition of ruminants, animal products (milk, meat, eggs and their by-products) and their processing, grasslands and roughages, livestock environment, animal biotechnology, animal behavior and welfare.
Articles generally report research involving beef cattle, dairy cattle, pigs, companion animals, goats, horses, and sheep. However, studies involving other farm animals, aquatic and wildlife species, and laboratory animal species that address fundamental questions related to livestock and companion animal biology will also be considered for publication.
The Journal of Animal Science and Technology (J. Anim. Sci. Technol. or JAST) has been the official journal of The Korean Society of Animal Science and Technology (KSAST) since 2000; it was formerly known as The Korean Journal of Animal Sciences (launched in 1956).