{"title":"A case study on data fusion with overlapping segments","authors":"A. Cloninger, W. Czaja, T. Doster","doi":"10.1109/AIPR.2013.6749316","DOIUrl":null,"url":null,"abstract":"With the continual improvement and diversification of existing sensing modalities and the emergence of new sensing technologies, methods to effectively and efficiently fuse the diverse and heterogeneous data sets are increasingly important. When different sensors acquire data over the same region of the Earth, a direct comparison between pixels acquired from one sensor to pixels acquired from a another sensor becomes difficult. For example, there could be different number of bands, or the sensors could measure drastically different spaces (hyperspectral and LIDAR). A solution to this problem is Feature Space Rotation, which realizes the sensor data independently in separate feature spaces via a machine learning algorithm and then a rotation is learned to bring the separate feature spaces into a common feature space. This rotation, in it original form, requires some amount of overlap between the data sets. We propose a study to determine the effect of decreasing the amount of overlap between the two sensors has on the classification accuracy. For this study, we shall rely on hyperspectral data that has been simulated to come from two disjoint sensors.","PeriodicalId":435620,"journal":{"name":"2013 IEEE Applied Imagery Pattern Recognition Workshop (AIPR)","volume":"14 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE Applied Imagery Pattern Recognition Workshop (AIPR)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/AIPR.2013.6749316","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 3
Abstract
With the continual improvement and diversification of existing sensing modalities and the emergence of new sensing technologies, methods to effectively and efficiently fuse diverse and heterogeneous data sets are increasingly important. When different sensors acquire data over the same region of the Earth, a direct comparison between pixels acquired by one sensor and pixels acquired by another becomes difficult. For example, the sensors may have different numbers of bands, or they may measure drastically different spaces (e.g., hyperspectral and LIDAR). A solution to this problem is Feature Space Rotation, which first embeds each sensor's data independently into its own feature space via a machine learning algorithm and then learns a rotation that brings the separate feature spaces into a common feature space. This rotation, in its original form, requires some amount of overlap between the data sets. We propose a study to determine the effect that decreasing the amount of overlap between the two sensors has on classification accuracy. For this study, we rely on hyperspectral data that has been simulated to come from two disjoint sensors.
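For illustration only, the sketch below mimics the overlap-based alignment idea described in the abstract: each sensor's data is embedded independently, and a rotation is then learned from the overlapping pixels to map one feature space onto the other. The data shapes, the use of PCA as the embedding, and the orthogonal Procrustes fit for the rotation are assumptions made for this example; they are not the authors' actual algorithm or experimental setup.

```python
import numpy as np
from scipy.linalg import orthogonal_procrustes

rng = np.random.default_rng(0)

# Simulated "disjoint sensor" data: two views of the same scene with
# different band counts (all shapes here are illustrative, not from the paper).
n_pixels, n_overlap = 500, 100
bands_a, bands_b = 120, 60
scene = rng.normal(size=(n_pixels, 10))        # latent scene structure
X_a = scene @ rng.normal(size=(10, bands_a))   # sensor A measurements
X_b = scene @ rng.normal(size=(10, bands_b))   # sensor B measurements

def embed(X, dim=10):
    """Stand-in embedding via PCA (the paper uses a machine learning
    embedding; PCA is only a placeholder here)."""
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt[:dim].T

F_a = embed(X_a)   # feature space of sensor A
F_b = embed(X_b)   # feature space of sensor B

# Learn a rotation from the overlapping pixels only (first n_overlap rows),
# then apply it to all of sensor B's features to map them into A's space.
R, _ = orthogonal_procrustes(F_b[:n_overlap], F_a[:n_overlap])
F_b_aligned = F_b @ R

# Alignment error on the non-overlapping pixels gives a rough sense of how
# well the learned rotation generalizes as the overlap shrinks.
err = (np.linalg.norm(F_b_aligned[n_overlap:] - F_a[n_overlap:])
       / np.linalg.norm(F_a[n_overlap:]))
print(f"relative alignment error outside the overlap: {err:.3f}")
```

Rerunning this sketch with smaller values of n_overlap is one simple way to probe the question the abstract raises: how the amount of sensor overlap affects the quality of the fused (rotated) representation.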