Yuanyin Luo, Yang Liu, Haorui Wang, Haifei Chen, Kai Liao, Lijun Li
Frontiers in Plant Science, volume 15, article 1389961 (JCR Q1, Plant Sciences; impact factor 4.1)
DOI: 10.3389/fpls.2024.1389961
Published: 2024-08-14 (journal article; eCollection 2024)
Open-access PDF: https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11443175/pdf/
YOLO-CFruit: a robust object detection method for Camellia oleifera fruit in complex environments.
Introduction: In agriculture, automated harvesting of Camellia oleifera fruit has become an important research area. However, accurately detecting Camellia oleifera fruit in natural environments is challenging: factors such as shadows can impede the performance of traditional detection techniques, highlighting the need for more robust methods.
Methods: To overcome these challenges, we propose an efficient deep learning method called YOLO-CFruit, specifically designed to accurately detect Camellia oleifera fruit in challenging natural environments. First, we collected images of Camellia oleifera fruit and built a dataset, then applied data augmentation to increase its diversity. The YOLO-CFruit model combines a CBAM attention module, which helps the network focus on image regions containing Camellia oleifera fruit, with a Transformer-based CSP module that captures global information. In addition, we improve on the original YOLOv5 by replacing the CIoU loss with the EIoU loss.
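The EIoU loss mentioned above differs from CIoU by replacing the coupled aspect-ratio term with separate width and height penalties, each normalized by the enclosing box. A minimal single-pair sketch of that formulation (the function name and the corner-coordinate box format are illustrative assumptions, not the authors' implementation):

```python
def eiou_loss(box_a, box_b):
    """EIoU loss between two boxes given as (x1, y1, x2, y2).

    EIoU = 1 - IoU
           + center_dist^2 / enclosing_diag^2   (as in DIoU/CIoU)
           + dw^2 / cw^2 + dh^2 / ch^2          (EIoU's width/height terms)
    where cw, ch are the width and height of the smallest enclosing box.
    """
    ax1, ay1, ax2, ay2 = box_a
    bx1, by1, bx2, by2 = box_b

    # Intersection-over-union
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    area_a = (ax2 - ax1) * (ay2 - ay1)
    area_b = (bx2 - bx1) * (by2 - by1)
    iou = inter / (area_a + area_b - inter)

    # Smallest enclosing box and its squared diagonal
    cw = max(ax2, bx2) - min(ax1, bx1)
    ch = max(ay2, by2) - min(ay1, by1)
    diag2 = cw * cw + ch * ch

    # Squared distance between box centers
    dcx = (ax1 + ax2) / 2 - (bx1 + bx2) / 2
    dcy = (ay1 + ay2) / 2 - (by1 + by2) / 2
    center2 = dcx * dcx + dcy * dcy

    # Width/height differences: the terms EIoU adds in place of
    # CIoU's single aspect-ratio penalty
    dw = (ax2 - ax1) - (bx2 - bx1)
    dh = (ay2 - ay1) - (by2 - by1)

    return 1 - iou + center2 / diag2 + dw * dw / (cw * cw) + dh * dh / (ch * ch)
```

Decoupling width and height this way gives a nonzero gradient on each dimension even when the aspect ratios already match, which is the usual motivation for preferring EIoU over CIoU in box regression.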
Results: Testing the trained network shows that the method performs well, achieving an average precision of 98.2%, a recall of 94.5%, an accuracy of 98%, an F1 score of 96.2%, and an average detection time of 19.02 ms per frame. The experimental results show that, compared with the conventional YOLOv5s network, our method improves average precision by 1.2% and achieves the highest accuracy and a higher F1 score than the other state-of-the-art networks evaluated.
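The reported F1 score is consistent with the harmonic mean of the reported per-class rates, if the 98% figure is read as precision. A quick check (the `f1_score` helper is an illustrative assumption, not code from the paper):

```python
def f1_score(precision, recall):
    """F1 score: the harmonic mean of precision and recall."""
    return 2 * precision * recall / (precision + recall)

# Reported precision 98% and recall 94.5% reproduce the reported F1:
print(round(f1_score(0.98, 0.945) * 100, 1))  # → 96.2
```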
Discussion: The robust performance of YOLO-CFruit under varied real-world conditions, including different lighting and shading scenarios, signifies its high reliability and lays a solid foundation for the development of automated picking devices.
About the journal:
In an ever-changing world, plant science is of the utmost importance for securing the future well-being of humankind. Plants provide oxygen, food, feed, fibers, and building materials, and they are a diverse source of industrial and pharmaceutical chemicals. Plants are centrally important to the health of ecosystems, and understanding them is critical for learning how to manage and maintain a sustainable biosphere. Plant science is highly interdisciplinary, reaching from agricultural science to paleobotany, and from molecular physiology to ecology. It uses the latest developments in computer science, optics, molecular biology, and genomics to address challenges in model systems, agricultural crops, and ecosystems. Plant science research inquires into the form, function, development, diversity, reproduction, evolution, and uses of both higher and lower plants and their interactions with other organisms throughout the biosphere. Frontiers in Plant Science welcomes outstanding contributions in any field of plant science, from basic to applied research, from organismal to molecular studies, from single-plant analysis to studies of populations and whole ecosystems, and from molecular to biophysical to computational approaches.
Frontiers in Plant Science publishes articles on the most outstanding discoveries across a broad spectrum of plant science research. Its mission is to bring all relevant areas of plant science together on a single platform.