Part task investigation of multispectral image fusion using gray scale and synthetic color night vision sensor imagery for helicopter pilotage

Authors: P. Steele, P. Perconti
DOI: 10.1117/12.276665
Journal: Storage and Retrieval for Image and Video Databases
Publication date: 1997-06-20
Citations: 57

Abstract: Today, night vision sensor and display systems used in the pilotage or navigation of military helicopters are either long-wave IR thermal sensors (8-12 microns) or image-intensified visible and near-IR (0.6-0.9 microns) sensors. The sensor imagery is displayed using a monochrome phosphor on a cathode ray tube or night vision goggle. Currently, there is no fielded capability to combine the best attributes of the emissive radiation sensed by the thermal sensor and the reflected radiation sensed by the image-intensified sensor into a single fused image. However, recent advances in signal processing have permitted the real-time fusion and display of multispectral sensor imagery in either monochrome or synthetic chromatic form. The merits of such signal processing are explored. A part task simulation using a desktop computer, video playback unit, and a biocular head-mounted display was conducted. Response time and accuracy measures of test subject responses to visual perception tasks were taken. Subjective ratings were collected to determine levels of pilot acceptance. In general, fusion-based formats resulted in better subject performance. The benefits of adding synthetic color to fused imagery, however, are dependent on the color algorithm used, the visual task performed, and the scene content.

© (1997) COPYRIGHT SPIE--The International Society for Optical Engineering. Downloading of the abstract is permitted for personal use only.
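The two fusion formats the abstract contrasts, monochrome (gray-scale) fusion and synthetic chromatic fusion, can be illustrated with a minimal sketch. The paper does not specify its fusion algorithms; the pixel-wise weighted average and the band-to-channel color mapping below are common, simple stand-ins chosen purely for illustration.

```python
import numpy as np

def fuse_grayscale(thermal, intensified, w=0.5):
    """Gray-scale fusion as a pixel-wise weighted average of the thermal
    (emissive) and image-intensified (reflective) bands.
    Illustrative only; the study's actual fusion algorithm is not given."""
    fused = w * thermal.astype(np.float64) + (1.0 - w) * intensified.astype(np.float64)
    return np.clip(fused, 0, 255).astype(np.uint8)

def fuse_synthetic_color(thermal, intensified):
    """One simple 'synthetic chromatic' scheme: map the thermal band to the
    red channel and the intensified band to green, leaving blue empty.
    The study compared several color algorithms; this is a hypothetical one."""
    rgb = np.zeros(thermal.shape + (3,), dtype=np.uint8)
    rgb[..., 0] = thermal   # emissive content shown as red
    rgb[..., 1] = intensified  # reflective content shown as green
    return rgb

# Toy 2x2 "images": one hot target (thermal) and one moonlit scene (intensified).
thermal = np.array([[200, 50], [120, 0]], dtype=np.uint8)
intensified = np.array([[100, 150], [80, 255]], dtype=np.uint8)

print(fuse_grayscale(thermal, intensified))      # averaged single-band image
print(fuse_synthetic_color(thermal, intensified).shape)  # (2, 2, 3)
```

The gray-scale output can drive the same monochrome phosphor displays described above, while the chromatic version requires a color-capable display; the study's finding that color benefits depend on the mapping is visible even here, since swapping the channel assignment changes which scene content stands out.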