{"title":"基于改进型 YOLOv9-DeepSORT 的鸟类和无人机识别检测与跟踪","authors":"Jincan Zhu;Chenhao Ma;Jian Rong;Yong Cao","doi":"10.1109/ACCESS.2024.3475629","DOIUrl":null,"url":null,"abstract":"At present, the protection of birds, especially endangered birds, faces major challenges. In the process of protection, birds are often mixed with various drones, and it is difficult to accurately count the number of endangered birds, especially at night, which brings great difficulties to bird protection work. So tracking and identifying birds and drones is essential to ensure the accuracy and efficiency of bird conservation efforts. To solve these problems, this paper proposes a new multi-target tracking (MOT) model based on the combination of YOLOv9 detection algorithm and DeepSORT tracking algorithm. Firstly, the original RepNSCPELAN4 module is replaced by CAM context feature enhancement module in Backbone to improve the model’s ability to extract small target features. Following this, the AFF channel attention mechanism has been integrated with RepNSCPELAN4 in the Head section to create the RepNSCPELAN4-AFF module, which aims to better address semantic and scale inconsistencies. Finally, a new RepNSCPELAN4-AKConv module has been developed using the AKConv dynamic Convolution module to replace the RepNSCPELAN4 module in the original Head section, enabling the model to more effectively capture detailed and contextual information. In the bird-UAV visible light comprehensive dataset proposed in this study, the mAP0.50 and F1 Score of all categories are 81.3% and 71.9% respectively by the improved YOLOv9-DeepSORT model. The mAP0.50 and F1 scores of individual birds are 89.1% and 82.4%, respectively. Compared to the Basic YOLOv9 model, the former improves by 7.9% and 5.3%, while the latter improves by 23.9% and 17.0%. On infrared datasets, compared to the original model, the mAP0.50 and F1 scores of the improved model improved by 3.2% and 1.4% across all categories compared to the original model. The average accuracy of identifying individual birds and similarly shaped fixed-wing drones also improved by 2.2% and 7.5% respectively. Moreover, on the mixed visible light and infrared data sets, the model get mAP0.50 of 81.8% higher 0.9% than that of the YOLOv9. These experiments demonstrate the improved YOLOv9-DeepSORT method can expand the multiscene application range of bird recognition and tracking models, effectively promoting the extraction of video frame features in multi-target tracking.","PeriodicalId":13079,"journal":{"name":"IEEE Access","volume":"12 ","pages":"147942-147957"},"PeriodicalIF":3.4000,"publicationDate":"2024-10-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10706910","citationCount":"0","resultStr":"{\"title\":\"Bird and UAVs Recognition Detection and Tracking Based on Improved YOLOv9-DeepSORT\",\"authors\":\"Jincan Zhu;Chenhao Ma;Jian Rong;Yong Cao\",\"doi\":\"10.1109/ACCESS.2024.3475629\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"At present, the protection of birds, especially endangered birds, faces major challenges. In the process of protection, birds are often mixed with various drones, and it is difficult to accurately count the number of endangered birds, especially at night, which brings great difficulties to bird protection work. So tracking and identifying birds and drones is essential to ensure the accuracy and efficiency of bird conservation efforts. 
To solve these problems, this paper proposes a new multi-target tracking (MOT) model based on the combination of YOLOv9 detection algorithm and DeepSORT tracking algorithm. Firstly, the original RepNSCPELAN4 module is replaced by CAM context feature enhancement module in Backbone to improve the model’s ability to extract small target features. Following this, the AFF channel attention mechanism has been integrated with RepNSCPELAN4 in the Head section to create the RepNSCPELAN4-AFF module, which aims to better address semantic and scale inconsistencies. Finally, a new RepNSCPELAN4-AKConv module has been developed using the AKConv dynamic Convolution module to replace the RepNSCPELAN4 module in the original Head section, enabling the model to more effectively capture detailed and contextual information. In the bird-UAV visible light comprehensive dataset proposed in this study, the mAP0.50 and F1 Score of all categories are 81.3% and 71.9% respectively by the improved YOLOv9-DeepSORT model. The mAP0.50 and F1 scores of individual birds are 89.1% and 82.4%, respectively. Compared to the Basic YOLOv9 model, the former improves by 7.9% and 5.3%, while the latter improves by 23.9% and 17.0%. On infrared datasets, compared to the original model, the mAP0.50 and F1 scores of the improved model improved by 3.2% and 1.4% across all categories compared to the original model. The average accuracy of identifying individual birds and similarly shaped fixed-wing drones also improved by 2.2% and 7.5% respectively. Moreover, on the mixed visible light and infrared data sets, the model get mAP0.50 of 81.8% higher 0.9% than that of the YOLOv9. These experiments demonstrate the improved YOLOv9-DeepSORT method can expand the multiscene application range of bird recognition and tracking models, effectively promoting the extraction of video frame features in multi-target tracking.\",\"PeriodicalId\":13079,\"journal\":{\"name\":\"IEEE Access\",\"volume\":\"12 \",\"pages\":\"147942-147957\"},\"PeriodicalIF\":3.4000,\"publicationDate\":\"2024-10-07\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"https://ieeexplore.ieee.org/stamp/stamp.jsp?tp=&arnumber=10706910\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Access\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10706910/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Access","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10706910/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Bird and UAVs Recognition Detection and Tracking Based on Improved YOLOv9-DeepSORT
At present, the protection of birds, especially endangered birds, faces major challenges. During conservation work, birds are often mixed with various drones, and it is difficult to count endangered birds accurately, especially at night, which greatly complicates bird protection. Tracking and identifying both birds and drones is therefore essential to ensure the accuracy and efficiency of bird conservation efforts. To solve these problems, this paper proposes a new multi-target tracking (MOT) model that combines the YOLOv9 detection algorithm with the DeepSORT tracking algorithm. First, the original RepNSCPELAN4 module in the Backbone is replaced by a CAM context feature enhancement module to improve the model's ability to extract small-target features. Next, the AFF channel attention mechanism is integrated with RepNSCPELAN4 in the Head to create the RepNSCPELAN4-AFF module, which better addresses semantic and scale inconsistencies. Finally, a new RepNSCPELAN4-AKConv module, built on the AKConv dynamic convolution module, replaces the RepNSCPELAN4 module in the original Head, enabling the model to capture detailed and contextual information more effectively. On the bird-UAV visible-light dataset introduced in this study, the improved YOLOv9-DeepSORT model achieves an mAP0.50 of 81.3% and an F1 score of 71.9% across all categories, and 89.1% and 82.4%, respectively, for individual birds. Compared with the baseline YOLOv9 model, the all-category scores improve by 7.9% and 5.3%, and the individual-bird scores by 23.9% and 17.0%. On infrared datasets, the improved model's mAP0.50 and F1 scores across all categories improve by 3.2% and 1.4% over the original model, and the average accuracy of identifying individual birds and similarly shaped fixed-wing drones improves by 2.2% and 7.5%, respectively. Moreover, on the mixed visible-light and infrared datasets, the model reaches an mAP0.50 of 81.8%, 0.9% higher than that of YOLOv9. These experiments demonstrate that the improved YOLOv9-DeepSORT method can expand the multi-scene application range of bird recognition and tracking models and effectively promotes the extraction of video frame features in multi-target tracking.
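The abstract describes coupling a YOLOv9 detector with DeepSORT so that per-frame detections of birds and drones are linked into persistent tracks. The sketch below illustrates that general detect-then-track loop; it is not the authors' implementation, and the third-party `ultralytics` and `deep-sort-realtime` packages, the `yolov9c.pt` checkpoint, and the video filename are assumed stand-ins for illustration only.

```python
# Minimal detect-then-track sketch (assumed packages: ultralytics, deep-sort-realtime, opencv-python).
import cv2
from ultralytics import YOLO
from deep_sort_realtime.deepsort_tracker import DeepSort

detector = YOLO("yolov9c.pt")          # hypothetical YOLOv9 checkpoint name
tracker = DeepSort(max_age=30)         # drop tracks unseen for 30 frames

cap = cv2.VideoCapture("birds_and_uavs.mp4")  # hypothetical input video
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break

    # 1) Per-frame detection: bounding boxes, confidences, class ids.
    result = detector(frame, verbose=False)[0]
    detections = []
    for box in result.boxes:
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        detections.append(([x1, y1, x2 - x1, y2 - y1],   # left, top, width, height
                           float(box.conf[0]),
                           int(box.cls[0])))

    # 2) DeepSORT association: appearance embeddings plus Kalman filtering
    #    link detections across frames into persistent track IDs.
    tracks = tracker.update_tracks(detections, frame=frame)
    for track in tracks:
        if not track.is_confirmed():
            continue
        l, t, r, b = track.to_ltrb()
        cv2.rectangle(frame, (int(l), int(t)), (int(r), int(b)), (0, 255, 0), 2)
        cv2.putText(frame, f"id {track.track_id}", (int(l), int(t) - 5),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 255, 0), 1)

    cv2.imshow("tracking", frame)
    if cv2.waitKey(1) == 27:  # Esc to quit
        break

cap.release()
cv2.destroyAllWindows()
```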
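The RepNSCPELAN4-AFF module integrates an AFF channel attention mechanism to reconcile semantic and scale inconsistencies between fused feature maps. The PyTorch sketch below shows a generic AFF-style fusion block (local plus global channel attention gating two branches), in the spirit of the original Attentional Feature Fusion design; the channel width and reduction ratio are assumptions, and this is an illustrative reconstruction rather than the paper's exact module.

```python
# Generic AFF-style attentional feature fusion block (illustrative, not the paper's RepNSCPELAN4-AFF).
import torch
import torch.nn as nn


class AFF(nn.Module):
    """Fuse two same-shaped feature maps with local + global channel attention."""

    def __init__(self, channels: int, r: int = 4):
        super().__init__()
        mid = max(channels // r, 1)
        # Local attention: per-position channel weighting via 1x1 convolutions.
        self.local_att = nn.Sequential(
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
        )
        # Global attention: squeeze spatial dims, then the same bottleneck.
        self.global_att = nn.Sequential(
            nn.AdaptiveAvgPool2d(1),
            nn.Conv2d(channels, mid, kernel_size=1),
            nn.BatchNorm2d(mid),
            nn.ReLU(inplace=True),
            nn.Conv2d(mid, channels, kernel_size=1),
            nn.BatchNorm2d(channels),
        )
        self.sigmoid = nn.Sigmoid()

    def forward(self, x: torch.Tensor, y: torch.Tensor) -> torch.Tensor:
        # x: e.g. a shallow/high-resolution branch, y: a deeper/semantic branch.
        s = x + y
        w = self.sigmoid(self.local_att(s) + self.global_att(s))
        # Soft-select between the two branches, channel- and position-wise.
        return w * x + (1.0 - w) * y


if __name__ == "__main__":
    aff = AFF(channels=64)
    a = torch.randn(2, 64, 40, 40)
    b = torch.randn(2, 64, 40, 40)
    print(aff(a, b).shape)  # torch.Size([2, 64, 40, 40])
```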
IEEE Access: COMPUTER SCIENCE, INFORMATION SYSTEMS; ENGINEERING, ELECTRICAL & ELECTRONIC
CiteScore: 9.80
Self-citation rate: 7.70%
Articles published: 6673
Review time: 6 weeks
Journal introduction:
IEEE Access® is a multidisciplinary, open access (OA), applications-oriented, all-electronic archival journal that continuously presents the results of original research or development across all of IEEE's fields of interest.
IEEE Access will publish articles that are of high interest to readers, original, technically correct, and clearly presented. Supported by author publication charges (APC), its hallmarks are a rapid peer review and publication process with open access to all readers. Unlike IEEE's traditional Transactions or Journals, reviews are "binary", in that reviewers will either Accept or Reject an article in the form it is submitted in order to achieve rapid turnaround. Especially encouraged are submissions on:
Multidisciplinary topics, or applications-oriented articles and negative results that do not fit within the scope of IEEE's traditional journals.
Practical articles discussing new experiments or measurement techniques, and interesting solutions to engineering problems.
Development of new or improved fabrication or manufacturing techniques.
Reviews or survey articles of new or evolving fields oriented to assist others in understanding the new area.