{"title":"Civilian target detection using hierarchical fusion","authors":"Balasubramanian Lakshminarayanan, H. Qi","doi":"10.1109/AIPR.2005.22","DOIUrl":null,"url":null,"abstract":"Automatic target recognition (ATR) is the process of aided or unaided target detection and recognition using data from different sensors. Fusion techniques are used to improve ATR since this reduces system dependence on a single sensor and increases noise tolerance. In this work, ATR is performed on civilian targets which are considered more difficult to classify than military targets. The dataset is provided by the Night Vision & Electronic Sensors Directorate (NVESD) and is collected using the sensor fusion testbed (SFTB) developed by Northrop Grumman Mission Systems. Stationary color and infrared cameras capture images of seven different vehicles at different orientations and distances. Targets include two sedans, two SUVs, two light trucks and a heavy truck. Fusion is performed at the event level and sensor level using temporal and behavior-knowledge-space (BKS) fusion respectively. It is shown that fusion provides better and robust classification compared to classification of individual frames without fusion. The classification experiment shows, on an average, mean classification rates of 65.0%, 70.1% and 77.7% for individual frame classification, temporal fusion and BKS fusion respectively. It is demonstrated that the classification accuracy increases as the level of fusion goes higher. By combining targets into cars, SUVs and light trucks and thereby reducing the number of classes to three, higher mean classification rates of 75.4%, 90.0% and 94.8% were obtained","PeriodicalId":130204,"journal":{"name":"34th Applied Imagery and Pattern Recognition Workshop (AIPR'05)","volume":"77 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2005-10-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"34th Applied Imagery and Pattern Recognition Workshop (AIPR'05)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIPR.2005.22","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Automatic target recognition (ATR) is the process of aided or unaided target detection and recognition using data from different sensors. Fusion techniques are used to improve ATR because they reduce the system's dependence on a single sensor and increase noise tolerance. In this work, ATR is performed on civilian targets, which are considered more difficult to classify than military targets. The dataset is provided by the Night Vision & Electronic Sensors Directorate (NVESD) and was collected using the sensor fusion testbed (SFTB) developed by Northrop Grumman Mission Systems. Stationary color and infrared cameras capture images of seven different vehicles at varying orientations and distances. Targets include two sedans, two SUVs, two light trucks and a heavy truck. Fusion is performed at the event level and the sensor level using temporal fusion and behavior-knowledge-space (BKS) fusion, respectively. It is shown that fusion provides better and more robust classification than classification of individual frames without fusion. The classification experiment yields mean classification rates of 65.0%, 70.1% and 77.7% for individual-frame classification, temporal fusion and BKS fusion, respectively, demonstrating that classification accuracy increases as the level of fusion rises. By combining targets into cars, SUVs and light trucks, thereby reducing the number of classes to three, higher mean classification rates of 75.4%, 90.0% and 94.8% were obtained.
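The paper itself provides no code; the sketch below is only a minimal illustration of the two combination schemes named in the abstract, under common textbook formulations: temporal fusion as a majority vote over the per-frame decisions for a tracked target, and behavior-knowledge-space (BKS) fusion as a lookup table indexed by the tuple of individual sensor decisions, with each cell storing the distribution of true labels observed on training data. All class names, function names and toy data here are invented for illustration and are not taken from the paper.

```python
from collections import Counter, defaultdict


class BKSFusion:
    """Illustrative behavior-knowledge-space (BKS) combiner.

    The BKS table maps each tuple of individual classifier decisions to the
    distribution of true labels seen for that tuple during training; the fused
    decision for a new sample is the most frequent true label in its cell.
    """

    def __init__(self):
        # decision tuple -> Counter of true labels
        self.table = defaultdict(Counter)

    def fit(self, decision_tuples, true_labels):
        for decisions, label in zip(decision_tuples, true_labels):
            self.table[tuple(decisions)][label] += 1
        return self

    def predict(self, decisions, fallback=None):
        cell = self.table.get(tuple(decisions))
        if not cell:
            # Unseen decision combination: fall back to a default
            # (here, the first sensor's decision).
            return fallback if fallback is not None else decisions[0]
        return cell.most_common(1)[0][0]


def temporal_fusion(frame_decisions):
    """Majority vote over the per-frame decisions for one tracked target."""
    return Counter(frame_decisions).most_common(1)[0][0]


if __name__ == "__main__":
    # Toy example: two sensors (color, IR) and three vehicle classes.
    train_decisions = [("sedan", "sedan"), ("sedan", "suv"),
                       ("suv", "suv"), ("truck", "truck")]
    train_labels = ["sedan", "sedan", "suv", "truck"]

    bks = BKSFusion().fit(train_decisions, train_labels)
    print(bks.predict(("sedan", "suv")))             # -> 'sedan'
    print(temporal_fusion(["suv", "sedan", "suv"]))  # -> 'suv'
```

In this reading, temporal fusion aggregates decisions from many frames of the same sensor (event level), while BKS fusion combines the decisions of different sensors (sensor level); how the paper handles ties or unseen BKS cells is not stated in the abstract, so the fallback above is purely an assumption.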