Soil classification with multi-temporal hyperspectral imagery using spectral unmixing and fusion

IF 1.4 · JCR Q4, Environmental Sciences · CAS Zone 4, Earth Sciences
Eylem Kaba, U. Leloglu
DOI: 10.1117/1.JRS.17.044513
Journal of Applied Remote Sensing, pp. 044513, published 2023-10-01 (Journal Article)
Citations: 0

Abstract. Soil maps are essential sources for a diverse range of agricultural and environmental studies; hence, the detection of soil properties using remote sensing technology is a hot topic. Satellites carrying hyperspectral sensors make it possible to estimate soil properties. However, the main obstacle to soil classification with remote sensing methods is vegetation, whose spectral signature mixes with that of the soil. The objective of this study is to detect soil texture properties after eliminating the effects of vegetation using hyperspectral imaging data and reducing the noise by fusion. First, the endmembers common to all images and their abundances are determined. The endmembers are then classified as stable (soil, rock, etc.) or unstable (green vegetation, dry vegetation, etc.). The method eliminates vegetation from the images with orthogonal subspace projection (OSP) and fuses multiple images with a weighted mean for a better signal-to-noise ratio. Finally, the fused image is classified to obtain the soil maps. The method is tested on synthetic images and Hyperion hyperspectral images of an area in Texas, United States. With three synthetic images, the individual classification accuracies are 89.14%, 89.81%, and 93.79%. After OSP, they increase to 92.23%, 93.13%, and 95.38%, respectively, and to 96.97% with fusion. With real images from 22/06/2013, 25/09/2013, and 24/10/2013, the classification accuracies increase from 70.51%, 68.87%, and 63.18% to 71.96%, 71.78%, and 64.17%, respectively; fusion yields a larger improvement, reaching 75.27% accuracy. The analysis of the real images from 2016 yields similar improvements: the classification accuracies increase from 57.07%, 62.81%, and 63.80% to 58.99%, 63.93%, and 66.33%, respectively, and fusion again gives the best accuracy, 69.02%. The results show that the method improves classification accuracy through the elimination of vegetation and the fusion of multiple images. The approach is promising and can be applied to various other classification tasks.
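The two core operations the abstract describes — projecting each pixel onto the subspace orthogonal to the unwanted (vegetation) endmembers, then fusing the co-registered images with a weighted mean — can be sketched in NumPy as follows. This is an illustrative sketch of the standard OSP operator P = I − U(UᵀU)⁻¹Uᵀ, not the authors' implementation; the function names, array layout, and weighting scheme are assumptions.

```python
import numpy as np

def osp_projector(U):
    """Build the OSP matrix P = I - U (U^T U)^{-1} U^T, which annihilates
    the column space of U (spectra of unwanted endmembers, bands x k)."""
    return np.eye(U.shape[0]) - U @ np.linalg.pinv(U)

def suppress_vegetation(cube, veg_spectra):
    """Project every pixel of a (rows, cols, bands) cube onto the
    subspace orthogonal to the vegetation spectra."""
    P = osp_projector(veg_spectra)      # (bands, bands), symmetric
    rows, cols, bands = cube.shape
    flat = cube.reshape(-1, bands)      # pixels x bands
    return (flat @ P).reshape(rows, cols, bands)

def fuse(cubes, weights):
    """Weighted-mean fusion of co-registered cubes; weights could come
    from per-image noise estimates to favor the cleaner acquisitions."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()                     # normalize so the mean is unbiased
    return np.tensordot(w, np.stack(cubes), axes=1)
```

Any spectrum lying in the span of `veg_spectra` is mapped to zero by `suppress_vegetation`, while components orthogonal to it (the stable soil/rock signatures) pass through unchanged, which is what makes the subsequent classification of the fused cube vegetation-free.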
Source journal: Journal of Applied Remote Sensing (Environmental Science; Imaging Science & Photographic Technology)
CiteScore: 3.40 · Self-citation rate: 11.80% · Annual output: 194 articles · Review time: 3 months
Journal description: The Journal of Applied Remote Sensing is a peer-reviewed journal that optimizes the communication of concepts, information, and progress among the remote sensing community.