{"title":"一种先进的辣椒病虫害检测深度学习方法。","authors":"Xuewei Wang, Jun Liu, Qian Chen","doi":"10.1186/s13007-025-01387-4","DOIUrl":null,"url":null,"abstract":"<p><p>Despite the significant progress in deep learning-based object detection, existing models struggle to perform optimally in complex agricultural environments. To address these challenges, this study introduces YOLO-Pepper, an enhanced model designed specifically for greenhouse pepper disease and pest detection, overcoming three key obstacles: small target recognition, multi-scale feature extraction under occlusion, and real-time processing demands. Built upon YOLOv10n, YOLO-Pepper incorporates four major innovations: (1) an Adaptive Multi-Scale Feature Extraction (AMSFE) module that improves feature capture through multi-branch convolutions; (2) a Dynamic Feature Pyramid Network (DFPN) enabling context-aware feature fusion; (3) a specialized Small Detection Head (SDH) tailored for minute targets; and (4) an Inner-CIoU loss function that enhances localization accuracy by 18% compared to standard CIoU. Evaluated on a diverse dataset of 8046 annotated images, YOLO-Pepper achieves state-of-the-art performance, with 94.26% mAP@0.5 at 115.26 FPS, marking an 11.88 percentage point improvement over YOLOv10n (82.38% mAP@0.5) while maintaining a lightweight structure (2.51 M parameters, 5.15 MB model size) optimized for edge deployment. Comparative experiments highlight YOLO-Pepper's superiority over nine benchmark models, particularly in detecting small and occluded targets. By addressing computational inefficiencies and refining small object detection capabilities, YOLO-Pepper provides robust technical support for intelligent agricultural monitoring systems, making it a highly effective tool for early disease detection and integrated pest management in commercial greenhouse operations.</p>","PeriodicalId":20100,"journal":{"name":"Plant Methods","volume":"21 1","pages":"70"},"PeriodicalIF":4.7000,"publicationDate":"2025-05-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12107738/pdf/","citationCount":"0","resultStr":"{\"title\":\"An advanced deep learning method for pepper diseases and pests detection.\",\"authors\":\"Xuewei Wang, Jun Liu, Qian Chen\",\"doi\":\"10.1186/s13007-025-01387-4\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>Despite the significant progress in deep learning-based object detection, existing models struggle to perform optimally in complex agricultural environments. To address these challenges, this study introduces YOLO-Pepper, an enhanced model designed specifically for greenhouse pepper disease and pest detection, overcoming three key obstacles: small target recognition, multi-scale feature extraction under occlusion, and real-time processing demands. Built upon YOLOv10n, YOLO-Pepper incorporates four major innovations: (1) an Adaptive Multi-Scale Feature Extraction (AMSFE) module that improves feature capture through multi-branch convolutions; (2) a Dynamic Feature Pyramid Network (DFPN) enabling context-aware feature fusion; (3) a specialized Small Detection Head (SDH) tailored for minute targets; and (4) an Inner-CIoU loss function that enhances localization accuracy by 18% compared to standard CIoU. 
Evaluated on a diverse dataset of 8046 annotated images, YOLO-Pepper achieves state-of-the-art performance, with 94.26% mAP@0.5 at 115.26 FPS, marking an 11.88 percentage point improvement over YOLOv10n (82.38% mAP@0.5) while maintaining a lightweight structure (2.51 M parameters, 5.15 MB model size) optimized for edge deployment. Comparative experiments highlight YOLO-Pepper's superiority over nine benchmark models, particularly in detecting small and occluded targets. By addressing computational inefficiencies and refining small object detection capabilities, YOLO-Pepper provides robust technical support for intelligent agricultural monitoring systems, making it a highly effective tool for early disease detection and integrated pest management in commercial greenhouse operations.</p>\",\"PeriodicalId\":20100,\"journal\":{\"name\":\"Plant Methods\",\"volume\":\"21 1\",\"pages\":\"70\"},\"PeriodicalIF\":4.7000,\"publicationDate\":\"2025-05-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC12107738/pdf/\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Plant Methods\",\"FirstCategoryId\":\"99\",\"ListUrlMain\":\"https://doi.org/10.1186/s13007-025-01387-4\",\"RegionNum\":2,\"RegionCategory\":\"生物学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"BIOCHEMICAL RESEARCH METHODS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Plant Methods","FirstCategoryId":"99","ListUrlMain":"https://doi.org/10.1186/s13007-025-01387-4","RegionNum":2,"RegionCategory":"生物学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"BIOCHEMICAL RESEARCH METHODS","Score":null,"Total":0}
An advanced deep learning method for pepper diseases and pests detection.
Despite the significant progress in deep learning-based object detection, existing models struggle to perform optimally in complex agricultural environments. To address these challenges, this study introduces YOLO-Pepper, an enhanced model designed specifically for greenhouse pepper disease and pest detection, overcoming three key obstacles: small target recognition, multi-scale feature extraction under occlusion, and real-time processing demands. Built upon YOLOv10n, YOLO-Pepper incorporates four major innovations: (1) an Adaptive Multi-Scale Feature Extraction (AMSFE) module that improves feature capture through multi-branch convolutions; (2) a Dynamic Feature Pyramid Network (DFPN) enabling context-aware feature fusion; (3) a specialized Small Detection Head (SDH) tailored for minute targets; and (4) an Inner-CIoU loss function that enhances localization accuracy by 18% compared to standard CIoU. Evaluated on a diverse dataset of 8046 annotated images, YOLO-Pepper achieves state-of-the-art performance, with 94.26% mAP@0.5 at 115.26 FPS, marking an 11.88 percentage point improvement over YOLOv10n (82.38% mAP@0.5) while maintaining a lightweight structure (2.51 M parameters, 5.15 MB model size) optimized for edge deployment. Comparative experiments highlight YOLO-Pepper's superiority over nine benchmark models, particularly in detecting small and occluded targets. By addressing computational inefficiencies and refining small object detection capabilities, YOLO-Pepper provides robust technical support for intelligent agricultural monitoring systems, making it a highly effective tool for early disease detection and integrated pest management in commercial greenhouse operations.
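The Inner-CIoU loss mentioned in the abstract can be illustrated with a short, hedged sketch. The snippet below is a minimal PyTorch implementation of an Inner-CIoU style box-regression loss, assuming the general Inner-IoU formulation (the IoU term is computed on centre-scaled auxiliary boxes and combined with the usual CIoU centre-distance and aspect-ratio penalties). The function name inner_ciou_loss, the (cx, cy, w, h) box format, and the default scaling ratio are illustrative assumptions, not details taken from the paper.

# Minimal sketch of an Inner-CIoU style loss, NOT the authors' exact implementation.
# Assumes pred and target are (N, 4) tensors of matched boxes in (cx, cy, w, h) format.
import math
import torch

def inner_ciou_loss(pred, target, ratio=0.8, eps=1e-7):
    px, py, pw, ph = pred.unbind(-1)
    tx, ty, tw, th = target.unbind(-1)

    # Corner coordinates of a box whose width/height are scaled by r about its centre.
    def corners(cx, cy, w, h, r):
        return cx - w * r / 2, cy - h * r / 2, cx + w * r / 2, cy + h * r / 2

    # Inner (centre-scaled) auxiliary boxes used only for the IoU term.
    px1, py1, px2, py2 = corners(px, py, pw, ph, ratio)
    tx1, ty1, tx2, ty2 = corners(tx, ty, tw, th, ratio)
    inter_w = (torch.min(px2, tx2) - torch.max(px1, tx1)).clamp(min=0)
    inter_h = (torch.min(py2, ty2) - torch.max(py1, ty1)).clamp(min=0)
    inter = inter_w * inter_h
    union = (pw * ratio) * (ph * ratio) + (tw * ratio) * (th * ratio) - inter + eps
    inner_iou = inter / union

    # CIoU penalties computed on the original (unscaled) boxes:
    # normalized centre distance over the enclosing-box diagonal, plus an aspect-ratio term.
    ex1, ey1, ex2, ey2 = corners(px, py, pw, ph, 1.0)
    gx1, gy1, gx2, gy2 = corners(tx, ty, tw, th, 1.0)
    cw = torch.max(ex2, gx2) - torch.min(ex1, gx1)
    ch = torch.max(ey2, gy2) - torch.min(ey1, gy1)
    c2 = cw ** 2 + ch ** 2 + eps
    rho2 = (px - tx) ** 2 + (py - ty) ** 2
    v = (4 / math.pi ** 2) * (torch.atan(tw / (th + eps)) - torch.atan(pw / (ph + eps))) ** 2
    with torch.no_grad():
        alpha = v / (1 - inner_iou + v + eps)

    return 1 - inner_iou + rho2 / c2 + alpha * v

As a usage sketch, calling inner_ciou_loss on matched prediction/target tensors of shape (N, 4) returns a per-pair loss of shape (N,), which would typically be averaged into the detector's total training objective; ratio values below 1 shrink the auxiliary boxes and are intended to sharpen gradients for small, tightly localized targets.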
Journal introduction:
Plant Methods is an open access, peer-reviewed, online journal for the plant research community that encompasses all aspects of technological innovation in the plant sciences.
There is no doubt that we have entered an exciting new era in plant biology. The completion of the Arabidopsis genome sequence, and the rapid progress being made in other plant genomics projects are providing unparalleled opportunities for progress in all areas of plant science. Nevertheless, enormous challenges lie ahead if we are to understand the function of every gene in the genome, and how the individual parts work together to make the whole organism. Achieving these goals will require an unprecedented collaborative effort, combining high-throughput, system-wide technologies with more focused approaches that integrate traditional disciplines such as cell biology, biochemistry and molecular genetics.
Technological innovation is probably the most important catalyst for progress in any scientific discipline. Plant Methods’ goal is to stimulate the development and adoption of new and improved techniques and research tools and, where appropriate, to promote consistency of methodologies for better integration of data from different laboratories.