Title: Fusion of LIDAR data with hyperspectral and high-resolution imagery for automation of DIRSIG scene generation
Authors: Ryan N. Givens, K. Walli, M. Eismann
Venue: 2012 IEEE Applied Imagery Pattern Recognition Workshop (AIPR)
Published: 2012-10-09
DOI: 10.1109/AIPR.2012.6528191
Citations: 5
Abstract
Developing new remote sensing instruments is a costly and time-consuming process. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model gives users the ability to create synthetic images for a proposed sensor before building it. However, to produce synthetic images, DIRSIG requires facetized, three-dimensional models attributed with spectral and texture information, which can themselves be costly and time-consuming to produce. Recent work by Walli has shown that coincident LIDAR data and high-resolution imagery can be registered and used to automatically generate the geometry and texture information needed for a DIRSIG scene. This method, called LIDAR Direct, greatly reduces the time and manpower needed to generate a scene, but still requires user interaction to attribute facets with either library or field-measured spectral information. This paper builds upon that work and presents a method for autonomously generating the geometry, texture, and spectral content for a scene when coincident LIDAR data, high-resolution imagery, and hyperspectral imagery (HSI) of a site are available. The method is then demonstrated on real data.
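The step the abstract automates, attributing each mesh facet with spectral content from co-registered HSI, can be illustrated with a minimal sketch. This is not the paper's algorithm; it assumes an already-registered HSI cube on a simple north-up affine grid and uses hypothetical names (`attribute_facets`, `origin`, `gsd`) to show a nearest-pixel spectral lookup for facet centroids derived from a LIDAR mesh.

```python
import numpy as np

def attribute_facets(centroids, hsi_cube, origin, gsd):
    """Assign each facet the spectrum of its nearest HSI pixel.

    centroids : (N, 2) facet centroid map coordinates (x east, y north)
    hsi_cube  : (rows, cols, bands) co-registered hyperspectral cube
    origin    : (x, y) map coordinate of the top-left pixel center
    gsd       : ground sample distance of the HSI grid (map units/pixel)
    """
    # Map coordinates -> integer pixel indices (y decreases down the rows).
    cols = np.clip(np.round((centroids[:, 0] - origin[0]) / gsd).astype(int),
                   0, hsi_cube.shape[1] - 1)
    rows = np.clip(np.round((origin[1] - centroids[:, 1]) / gsd).astype(int),
                   0, hsi_cube.shape[0] - 1)
    # One spectrum per facet: shape (N, bands).
    return hsi_cube[rows, cols, :]
```

In practice the paper's pipeline would sit between registered LIDAR geometry and this lookup; a real system would also need material classification rather than raw per-pixel spectra, but the sketch shows the basic geometry-to-spectrum association.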