Dehao Dong, Jianzhuang Li, Haiying Liu, Lixia Deng, Jason Gu, Lida Liu, Shuang Li
{"title":"EA-YOLO:高效准确的无人飞行器图像目标检测算法","authors":"Dehao Dong, Jianzhuang Li, Haiying Liu, Lixia Deng, Jason Gu, Lida Liu, Shuang Li","doi":"10.1002/tee.24180","DOIUrl":null,"url":null,"abstract":"An improved EA‐YOLO object detection algorithm based on YOLOv5 is proposed to address the issues of drastic changes in target scale, low detection accuracy, and high miss rate in unmanned aerial vehicle aerial photography scenarios. Firstly, a DFE module was proposed to improve the effectiveness of feature extraction and enhance the whole model's ability to learn residual features. Secondly, a CWFF architecture was introduced to enable deeper feature fusion and improve the effectiveness of feature fusion. Finally, in order to solve the traditional algorithm's shortcomings it is difficult to detect small targets. We have designed a novel SDS structure and adopted a strategy of reusing low‐level feature maps to enhance the network's ability to detect small targets, making it more suitable for detecting some small objects in drone images. Experiments in the VisDrone2019 dataset demonstrated that the proposed EA‐YOLOs achieved an average accuracy mAP@0.5 of 39.9%, which is an 8% improvement over YOLOv5s, and mAP@0.5:0.95 of 22.2%, which is 5.2% improvement over the original algorithm. Compared with YOLOv3, YOLOv5l, and YOLOv8s, the mAP@0.5 of EA‐YOLOs improved by 0.9%, 1.8%, and 0.6%, while the GFLOPs decreased by 86.4%, 80.6%, and 26.7%. © 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.","PeriodicalId":13435,"journal":{"name":"IEEJ Transactions on Electrical and Electronic Engineering","volume":"26 1","pages":""},"PeriodicalIF":1.0000,"publicationDate":"2024-08-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"EA‐YOLO: An Efficient and Accurate UAV Image Object Detection Algorithm\",\"authors\":\"Dehao Dong, Jianzhuang Li, Haiying Liu, Lixia Deng, Jason Gu, Lida Liu, Shuang Li\",\"doi\":\"10.1002/tee.24180\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"An improved EA‐YOLO object detection algorithm based on YOLOv5 is proposed to address the issues of drastic changes in target scale, low detection accuracy, and high miss rate in unmanned aerial vehicle aerial photography scenarios. Firstly, a DFE module was proposed to improve the effectiveness of feature extraction and enhance the whole model's ability to learn residual features. Secondly, a CWFF architecture was introduced to enable deeper feature fusion and improve the effectiveness of feature fusion. Finally, in order to solve the traditional algorithm's shortcomings it is difficult to detect small targets. We have designed a novel SDS structure and adopted a strategy of reusing low‐level feature maps to enhance the network's ability to detect small targets, making it more suitable for detecting some small objects in drone images. Experiments in the VisDrone2019 dataset demonstrated that the proposed EA‐YOLOs achieved an average accuracy mAP@0.5 of 39.9%, which is an 8% improvement over YOLOv5s, and mAP@0.5:0.95 of 22.2%, which is 5.2% improvement over the original algorithm. Compared with YOLOv3, YOLOv5l, and YOLOv8s, the mAP@0.5 of EA‐YOLOs improved by 0.9%, 1.8%, and 0.6%, while the GFLOPs decreased by 86.4%, 80.6%, and 26.7%. 
© 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.\",\"PeriodicalId\":13435,\"journal\":{\"name\":\"IEEJ Transactions on Electrical and Electronic Engineering\",\"volume\":\"26 1\",\"pages\":\"\"},\"PeriodicalIF\":1.0000,\"publicationDate\":\"2024-08-26\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEJ Transactions on Electrical and Electronic Engineering\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1002/tee.24180\",\"RegionNum\":4,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENGINEERING, ELECTRICAL & ELECTRONIC\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEJ Transactions on Electrical and Electronic Engineering","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1002/tee.24180","RegionNum":4,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENGINEERING, ELECTRICAL & ELECTRONIC","Score":null,"Total":0}
EA‐YOLO: An Efficient and Accurate UAV Image Object Detection Algorithm
An improved object detection algorithm, EA‐YOLO, based on YOLOv5 is proposed to address the drastic changes in target scale, low detection accuracy, and high miss rate encountered in unmanned aerial vehicle (UAV) aerial photography. First, a DFE module is proposed to improve the effectiveness of feature extraction and enhance the model's ability to learn residual features. Second, a CWFF architecture is introduced to enable deeper and more effective feature fusion. Finally, to overcome the traditional algorithm's difficulty in detecting small targets, a novel SDS structure is designed and a strategy of reusing low‐level feature maps is adopted, strengthening the network's ability to detect the small objects common in drone images. Experiments on the VisDrone2019 dataset show that the proposed EA‐YOLOs achieves a mean average precision mAP@0.5 of 39.9%, an 8% improvement over YOLOv5s, and an mAP@0.5:0.95 of 22.2%, a 5.2% improvement over the original algorithm. Compared with YOLOv3, YOLOv5l, and YOLOv8s, the mAP@0.5 of EA‐YOLOs improves by 0.9%, 1.8%, and 0.6%, respectively, while the GFLOPs decrease by 86.4%, 80.6%, and 26.7%. © 2024 Institute of Electrical Engineers of Japan and Wiley Periodicals LLC.
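The abstract does not specify how the DFE, CWFF, or SDS modules are built, so the sketch below only illustrates the general idea it mentions of reusing a high-resolution, low-level backbone feature map to help detect small objects. It is a minimal PyTorch illustration under assumed shapes and module names (LowLevelReuse, the stride-4/stride-8 feature sizes), not the authors' actual design.

```python
# Illustrative sketch only: routes a high-resolution, low-level backbone feature
# (e.g., a stride-4 map) back into the detection neck so small objects retain more
# spatial detail. All module names and tensor shapes here are hypothetical.
import torch
import torch.nn as nn

class LowLevelReuse(nn.Module):
    """Fuse an upsampled neck feature with a reused low-level backbone feature."""
    def __init__(self, low_ch, neck_ch, out_ch):
        super().__init__()
        self.reduce = nn.Conv2d(low_ch, out_ch, kernel_size=1)     # align channels
        self.up = nn.Upsample(scale_factor=2, mode="nearest")      # match resolution
        self.fuse = nn.Conv2d(out_ch + neck_ch, out_ch, kernel_size=3, padding=1)

    def forward(self, low_feat, neck_feat):
        x = torch.cat([self.reduce(low_feat), self.up(neck_feat)], dim=1)
        return self.fuse(x)

# Toy shapes: a stride-4 backbone map (160x160) and a stride-8 neck map (80x80).
low = torch.randn(1, 128, 160, 160)
neck = torch.randn(1, 256, 80, 80)
out = LowLevelReuse(low_ch=128, neck_ch=256, out_ch=128)(low, neck)
print(out.shape)  # torch.Size([1, 128, 160, 160])
```

Attaching a detection head to such a high-resolution fused map is a common way to improve recall on small targets, at the cost of extra computation on the largest feature map.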