YOLO-feed: An advanced lightweight network enabling real-time, high-precision detection of feed pellets on CPU devices and its applications in quantifying individual fish feed intake
Miaosheng Feng , Pengxin Jiang , Yuhang Wang , Shuimu Hu , Sijing Chen , Rui Li , Haibin Huang , Ning Li , Boyu Zhang , Qiaozhen Ke , Yu Zhang , Peng Xu
{"title":"YOLO-feed: An advanced lightweight network enabling real-time, high-precision detection of feed pellets on CPU devices and its applications in quantifying individual fish feed intake","authors":"Miaosheng Feng , Pengxin Jiang , Yuhang Wang , Shuimu Hu , Sijing Chen , Rui Li , Haibin Huang , Ning Li , Boyu Zhang , Qiaozhen Ke , Yu Zhang , Peng Xu","doi":"10.1016/j.aquaculture.2025.742700","DOIUrl":null,"url":null,"abstract":"<div><div>With the rapid advancement of deep learning, its application in aquaculture has grown significantly. However, the widespread use of deep learning models in this field is hindered by the large number of parameters and high computational power requirements. Moreover, accurately quantifying individual feed intake is essential for genetic improvement of feed efficiency, scientific feeding practices, and increasing aquaculture yield. Unfortunately, existing methods struggle to operate in real-time on low-performance hardware. To address these challenges, this study proposes YOLO-Feed, a cost-effective and highly efficient deep learning network. YOLO-Feed significantly reduces model size by eliminating the large object detection head and its corresponding branch network from You Only Look Once (YOLO) v8s, while introducing a powerful intersection-over-union (IoU) loss to enhance performance. We trained the model on over 9000 labeled feed pellet images. The YOLO-Feed model achieved a precision of 0.966, comparable to YOLOv8s, while reducing the number of parameters to less than 1 million— a 12.3-fold compression. In comparative experiments, YOLO-Feed maintained near-optimal precision with minimal parameters and was the only model capable of processing a single image on a CPU in just 16.9 ms. Furthermore, in an indoor aquaculture experiment involving 1027 fish, YOLO-Feed was successfully applied to measure individual feed intake, achieving an average accuracy of 90.3 % over a 733-min experimental period. 
These results demonstrate the superior performance and scalability of the YOLO-Feed model. To expedite research on the genetic improvement of feeding efficiency, we have open - sourced this project on GitHub (<span><span>https://github.com/miaomiaoge/YOLO-Feed</span><svg><path></path></svg></span>).</div></div>","PeriodicalId":8375,"journal":{"name":"Aquaculture","volume":"608 ","pages":"Article 742700"},"PeriodicalIF":3.9000,"publicationDate":"2025-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Aquaculture","FirstCategoryId":"97","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0044848625005861","RegionNum":1,"RegionCategory":"农林科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"FISHERIES","Score":null,"Total":0}
Citations: 0
Abstract
With the rapid advancement of deep learning, its application in aquaculture has grown significantly. However, the widespread use of deep learning models in this field is hindered by their large parameter counts and high computational requirements. Moreover, accurately quantifying individual feed intake is essential for the genetic improvement of feed efficiency, scientific feeding practices, and increasing aquaculture yield. Unfortunately, existing methods struggle to operate in real time on low-performance hardware. To address these challenges, this study proposes YOLO-Feed, a cost-effective and highly efficient deep learning network. YOLO-Feed significantly reduces model size by eliminating the large-object detection head and its corresponding branch network from You Only Look Once (YOLO) v8s, while introducing a powerful intersection-over-union (IoU) loss to enhance performance. We trained the model on over 9000 labeled feed pellet images. The YOLO-Feed model achieved a precision of 0.966, comparable to YOLOv8s, while reducing the number of parameters to less than 1 million, a 12.3-fold compression. In comparative experiments, YOLO-Feed maintained near-optimal precision with minimal parameters and was the only model capable of processing a single image on a CPU in just 16.9 ms. Furthermore, in an indoor aquaculture experiment involving 1027 fish, YOLO-Feed was successfully applied to measure individual feed intake, achieving an average accuracy of 90.3% over a 733-min experimental period. These results demonstrate the superior performance and scalability of the YOLO-Feed model. To expedite research on the genetic improvement of feeding efficiency, we have open-sourced this project on GitHub (https://github.com/miaomiaoge/YOLO-Feed).
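The abstract does not specify which IoU-loss variant YOLO-Feed adopts, so as a purely illustrative sketch, the plain IoU computation that such losses refine (e.g., CIoU or SIoU variants add distance and aspect-ratio penalty terms) and the basic 1 − IoU loss might look like this; `iou`, `iou_loss`, and the corner-format boxes are assumptions, not the paper's implementation:

```python
def iou(box_a, box_b):
    """Intersection over union of two axis-aligned boxes in (x1, y1, x2, y2) format."""
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b
    # Corners of the intersection rectangle.
    ix1, iy1 = max(ax1, bx1), max(ay1, by1)
    ix2, iy2 = min(ax2, bx2), min(ay2, by2)
    # Clamp at zero so disjoint boxes give zero overlap.
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0


def iou_loss(pred, target):
    """Basic IoU loss: 1 - IoU, minimized when boxes coincide."""
    return 1.0 - iou(pred, target)
```

In practice such a loss is applied per matched prediction–target pair during training; stronger variants keep the same 1 − IoU core but add penalty terms that provide a gradient even when boxes do not overlap.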
Journal Introduction
Aquaculture is an international journal for the exploration, improvement and management of all freshwater and marine food resources. It publishes novel and innovative research of worldwide interest on the farming of aquatic organisms, including finfish, mollusks, crustaceans and aquatic plants for human consumption. Research on ornamentals is not a focus of the Journal. Aquaculture only publishes papers with a clear relevance to improving aquaculture practices or a potential application.