DetDSHAP: Explainable Object Detection for Uncrewed and Autonomous Drones With Shapley Values
Authors: Maxwell Hogan, Nabil Aouf
Journal: IET Radar, Sonar and Navigation, vol. 19, no. 1 (impact factor 1.4)
DOI: 10.1049/rsn2.70042
Published: 2025-06-18
Open-access PDF: https://onlinelibrary.wiley.com/doi/epdf/10.1049/rsn2.70042
Automatic object detection onboard drones is essential for facilitating autonomous operations. The advent of deep learning techniques has significantly enhanced the efficacy of object detection and recognition systems. However, the implementation of deep networks in real-world operational settings for autonomous decision-making presents several challenges. A primary concern is the lack of transparency in deep learning algorithms, which renders their behaviour unreliable to both practitioners and the general public. Additionally, deep networks often require substantial computational resources, which may not be feasible for many compact portable platforms. This paper aims to address these challenges and promote the integration of deep object detectors in drone applications. We present a novel interpretative framework, DetDSHAP, designed to elucidate the predictions generated by the YOLOv5 detector. Furthermore, we propose utilising the contribution scores derived from our explanatory model as an innovative pruning technique for the YOLOv5 network, thereby achieving enhanced performance while minimising computational demands. Lastly, we provide performance evaluations of our approach demonstrating its efficiency across various datasets, including real data collected from drone-mounted cameras and established public benchmark datasets.
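The abstract describes two ideas: attributing a detector's score to its inputs with Shapley values, and reusing those contribution scores as a pruning criterion. The sketch below is a generic illustration of both, not the paper's DetDSHAP propagation: a Monte Carlo Shapley estimator over a black-box scoring function (the toy linear `score` stands in for a detection head), plus a hypothetical `prune_by_contribution` helper that keeps the highest-scoring channels.

```python
import random

def shapley_values(model, features, n_samples=500, baseline=0.0, seed=0):
    """Monte Carlo estimate of the Shapley value of each feature.

    `model` maps a feature vector (list of floats) to a scalar score;
    features absent from a coalition are replaced by `baseline`.
    Averages each feature's marginal contribution over random
    feature orderings (permutation sampling).
    """
    rng = random.Random(seed)
    n = len(features)
    phi = [0.0] * n
    for _ in range(n_samples):
        perm = list(range(n))
        rng.shuffle(perm)
        masked = [baseline] * n
        prev = model(masked)
        for i in perm:           # add features one at a time
            masked[i] = features[i]
            cur = model(masked)
            phi[i] += cur - prev  # marginal contribution of feature i
            prev = cur
    return [p / n_samples for p in phi]

def prune_by_contribution(scores, keep_ratio=0.5):
    """Rank channels by attribution score; return sorted indices to keep."""
    k = max(1, int(len(scores) * keep_ratio))
    ranked = sorted(range(len(scores)), key=lambda i: scores[i], reverse=True)
    return sorted(ranked[:k])

# Toy "detector score": linear, so (with a zero baseline) the exact
# Shapley values are w_i * x_i and the estimator recovers them.
weights = [0.5, -1.0, 2.0]
score = lambda x: sum(w * v for w, v in zip(weights, x))
vals = shapley_values(score, [1.0, 2.0, 3.0])
print(vals)                          # ~[0.5, -2.0, 6.0]
print(prune_by_contribution(vals))   # keeps the most influential half
```

For a linear model every permutation yields the same marginal contributions, so the estimate is exact up to floating-point error; the Shapley efficiency property also guarantees the attributions sum to the score minus the baseline score, which is a useful sanity check on any implementation.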
Journal description:
IET Radar, Sonar & Navigation covers the theory and practice of systems and signals for radar, sonar, radiolocation, navigation, and surveillance purposes, in aerospace and terrestrial applications.
Examples include advances in waveform design, clutter and detection, electronic warfare, adaptive array and super-resolution methods, tracking algorithms, synthetic aperture, and target recognition techniques.