Fusion of LIDAR data with hyperspectral and high-resolution imagery for automation of DIRSIG scene generation

Ryan N. Givens, K. Walli, M. Eismann
{"title":"Fusion of LIDAR data with hyperspectral and high-resolution imagery for automation of DIRSIG scene generation","authors":"Ryan N. Givens, K. Walli, M. Eismann","doi":"10.1109/AIPR.2012.6528191","DOIUrl":null,"url":null,"abstract":"Developing new remote sensing instruments is a costly and time consuming process. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model gives users the ability to create synthetic images for a proposed sensor before building it. However, to produce synthetic images, DIRSIG requires facetized, three-dimensional models attributed with spectral and texture information which can themselves be costly and time consuming to produce. Recent work by Walli has shown that coincident LIDAR data and high-resolution imagery can be registered and used to automatically generate the geometry and texture information needed for a DIRSIG scene. This method, called LIDAR Direct, greatly reduces the time and manpower needed to generate a scene, but still requires user interaction to attribute facets with either library or field measured spectral information. This paper builds upon that work and presents a method for autonomously generating the geometry, texture, and spectral content for a scene when coincident LIDAR data, high-resolution imagery, and HyperSpectral Imagery (HSI) of a site are available. Then the method is demonstrated on real data.","PeriodicalId":406942,"journal":{"name":"2012 IEEE Applied Imagery Pattern Recognition Workshop (AIPR)","volume":"36 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2012-10-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2012 IEEE Applied Imagery Pattern Recognition Workshop (AIPR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIPR.2012.6528191","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 5

Abstract

Developing new remote sensing instruments is a costly and time-consuming process. The Digital Imaging and Remote Sensing Image Generation (DIRSIG) model gives users the ability to create synthetic images for a proposed sensor before it is built. However, to produce synthetic images, DIRSIG requires facetized, three-dimensional models attributed with spectral and texture information, which can themselves be costly and time-consuming to produce. Recent work by Walli has shown that coincident LIDAR data and high-resolution imagery can be registered and used to automatically generate the geometry and texture information needed for a DIRSIG scene. This method, called LIDAR Direct, greatly reduces the time and manpower needed to generate a scene, but still requires user interaction to attribute facets with either library or field-measured spectral information. This paper builds on that work and presents a method for autonomously generating the geometry, texture, and spectral content of a scene when coincident LIDAR data, high-resolution imagery, and hyperspectral imagery (HSI) of a site are available. The method is then demonstrated on real data.
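
To make the attribution step concrete, the sketch below (not the authors' implementation; all function names and data layouts are assumptions) shows how LIDAR-derived facets might be attributed with texture from the high-resolution image and a mean spectrum from the hyperspectral cube, assuming all three data products have already been registered to a common pixel grid. In practice the HSI is typically coarser than the high-resolution imagery and would need resampling, and the per-facet mean spectrum could be replaced by a classification against a spectral library; the mean keeps the example minimal.

```python
# Minimal, illustrative sketch of the facet-attribution idea described in the
# abstract. Assumes the LIDAR-derived facets, high-resolution image, and HSI
# cube are already co-registered on one pixel grid. Names are hypothetical.
import numpy as np

def attribute_facets(facet_masks, highres_image, hsi_cube):
    """Attribute each facet with a texture patch and a mean spectrum.

    facet_masks   : list of boolean (H, W) masks, one per LIDAR-derived facet
    highres_image : (H, W, 3) high-resolution image supplying texture
    hsi_cube      : (H, W, B) hyperspectral cube supplying spectra
    """
    attributed = []
    for mask in facet_masks:
        texture = highres_image[mask]            # image pixels covered by the facet
        spectrum = hsi_cube[mask].mean(axis=0)   # average spectrum over those pixels
        attributed.append({"texture": texture, "spectrum": spectrum})
    return attributed

if __name__ == "__main__":
    # Toy stand-ins for registered products on a 100 x 100 grid:
    # a 3-band high-resolution image and a 50-band hyperspectral cube.
    rng = np.random.default_rng(0)
    highres = rng.random((100, 100, 3))
    hsi = rng.random((100, 100, 50))

    roof = np.zeros((100, 100), dtype=bool)
    roof[20:40, 30:60] = True                    # one rectangular roof facet

    scene = attribute_facets([roof], highres, hsi)
    print(scene[0]["spectrum"].shape)            # -> (50,)
```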