{"title":"利用光谱非混合和融合技术对多时高光谱图像进行土壤分类","authors":"Eylem Kaba, U. Leloglu","doi":"10.1117/1.JRS.17.044513","DOIUrl":null,"url":null,"abstract":"Abstract. Soil maps are essential sources for a diverse range of agricultural and environmental studies; hence, the detection of soil properties using remote sensing technology is a hot topic. Satellites carrying hyperspectral sensors provide possibilities for the estimation of soil properties. But, the main obstacle in soil classification with remote sensing methods is the vegetation that has a spectral signature that mixes with that of the soil. The objective of this study is to detect soil texture properties after eliminating the effects of vegetation using hyperspectral imaging data and reducing the noise by fusion. First, the endmembers common to all images and their abundances are determined. Then the endmembers are classified as stable ones (soil, rock, etc.) and unstable ones (green vegetation, dry vegetation, etc.). This method eliminates vegetation from the images with orthogonal subspace projection (OSP) and fuses multiple images with the weighted mean for a better signal-to-noise-ratio. Finally, the fused image is classified to obtain the soil maps. The method is tested on synthetic images and hyperion hyperspectral images of an area in Texas, United States. With three synthetic images, the individual classification results are 89.14%, 89.81%, and 93.79%. After OSP, the rates increase to 92.23%, 93.13%, and 95.38%, respectively, whereas it increases to 96.97% with fusion. With real images from the dates 22/06/2013, 25/09/2013, and 24/10/2013, the classification accuracies increase from 70.51%, 68.87%, and 63.18% to 71.96%, 71.78%, and 64.17%, respectively. Fusion provides a better improvement in classification with a 75.27% accuracy. The results for the analysis of the real images from 2016 yield similar improvements. 
The classification accuracies increase from 57.07%, 62.81%, and 63.80% to 58.99%, 63.93%, and 66.33%, respectively. Fusion also provides a better classification accuracy of 69.02% for this experiment. The results show that the method can improve the classification accuracy with the elimination of vegetation and with the fusion of multiple images. The approach is promising and can be applied to various other classification tasks.","PeriodicalId":54879,"journal":{"name":"Journal of Applied Remote Sensing","volume":"38 1","pages":"044513 - 044513"},"PeriodicalIF":1.4000,"publicationDate":"2023-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Soil classification with multi-temporal hyperspectral imagery using spectral unmixing and fusion\",\"authors\":\"Eylem Kaba, U. Leloglu\",\"doi\":\"10.1117/1.JRS.17.044513\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract. Soil maps are essential sources for a diverse range of agricultural and environmental studies; hence, the detection of soil properties using remote sensing technology is a hot topic. Satellites carrying hyperspectral sensors provide possibilities for the estimation of soil properties. But, the main obstacle in soil classification with remote sensing methods is the vegetation that has a spectral signature that mixes with that of the soil. The objective of this study is to detect soil texture properties after eliminating the effects of vegetation using hyperspectral imaging data and reducing the noise by fusion. First, the endmembers common to all images and their abundances are determined. Then the endmembers are classified as stable ones (soil, rock, etc.) and unstable ones (green vegetation, dry vegetation, etc.). This method eliminates vegetation from the images with orthogonal subspace projection (OSP) and fuses multiple images with the weighted mean for a better signal-to-noise-ratio. 
Finally, the fused image is classified to obtain the soil maps. The method is tested on synthetic images and hyperion hyperspectral images of an area in Texas, United States. With three synthetic images, the individual classification results are 89.14%, 89.81%, and 93.79%. After OSP, the rates increase to 92.23%, 93.13%, and 95.38%, respectively, whereas it increases to 96.97% with fusion. With real images from the dates 22/06/2013, 25/09/2013, and 24/10/2013, the classification accuracies increase from 70.51%, 68.87%, and 63.18% to 71.96%, 71.78%, and 64.17%, respectively. Fusion provides a better improvement in classification with a 75.27% accuracy. The results for the analysis of the real images from 2016 yield similar improvements. The classification accuracies increase from 57.07%, 62.81%, and 63.80% to 58.99%, 63.93%, and 66.33%, respectively. Fusion also provides a better classification accuracy of 69.02% for this experiment. The results show that the method can improve the classification accuracy with the elimination of vegetation and with the fusion of multiple images. 
The approach is promising and can be applied to various other classification tasks.\",\"PeriodicalId\":54879,\"journal\":{\"name\":\"Journal of Applied Remote Sensing\",\"volume\":\"38 1\",\"pages\":\"044513 - 044513\"},\"PeriodicalIF\":1.4000,\"publicationDate\":\"2023-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Applied Remote Sensing\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://doi.org/10.1117/1.JRS.17.044513\",\"RegionNum\":4,\"RegionCategory\":\"地球科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"ENVIRONMENTAL SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Applied Remote Sensing","FirstCategoryId":"5","ListUrlMain":"https://doi.org/10.1117/1.JRS.17.044513","RegionNum":4,"RegionCategory":"地球科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"ENVIRONMENTAL SCIENCES","Score":null,"Total":0}
Soil classification with multi-temporal hyperspectral imagery using spectral unmixing and fusion
Abstract. Soil maps are essential sources for a diverse range of agricultural and environmental studies; hence, the detection of soil properties using remote sensing technology is an active research topic. Satellites carrying hyperspectral sensors make it possible to estimate soil properties, but the main obstacle to soil classification with remote sensing methods is vegetation, whose spectral signature mixes with that of the soil. The objective of this study is to detect soil texture properties from hyperspectral imaging data after eliminating the effects of vegetation and reducing noise by fusion. First, the endmembers common to all images, together with their abundances, are determined. The endmembers are then classified as stable (soil, rock, etc.) or unstable (green vegetation, dry vegetation, etc.). The method eliminates vegetation from the images with orthogonal subspace projection (OSP) and fuses multiple images with a weighted mean for a better signal-to-noise ratio. Finally, the fused image is classified to obtain the soil maps. The method is tested on synthetic images and on Hyperion hyperspectral images of an area in Texas, United States. With three synthetic images, the individual classification accuracies are 89.14%, 89.81%, and 93.79%. After OSP, they increase to 92.23%, 93.13%, and 95.38%, respectively, and fusion raises the accuracy to 96.97%. With real images from 22/06/2013, 25/09/2013, and 24/10/2013, the classification accuracies increase from 70.51%, 68.87%, and 63.18% to 71.96%, 71.78%, and 64.17%, respectively; fusion provides a larger improvement, with 75.27% accuracy. The analysis of the real images from 2016 yields similar improvements: the classification accuracies increase from 57.07%, 62.81%, and 63.80% to 58.99%, 63.93%, and 66.33%, respectively, and fusion again gives the best accuracy, 69.02%.
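The vegetation-suppression step described above can be sketched with standard linear algebra. This is a minimal illustration of OSP, not the authors' implementation: the function name `osp_project` and the toy spectra are hypothetical, and the projector P = I - U(UᵀU)⁻¹Uᵀ is the textbook form that annihilates the span of the unwanted (vegetation) endmember signatures U.

```python
import numpy as np

def osp_project(pixels, unwanted_endmembers):
    """Project pixel spectra onto the subspace orthogonal to the
    unwanted endmember signatures (e.g., green and dry vegetation).

    pixels: (n_pixels, n_bands) array of spectra.
    unwanted_endmembers: (n_bands, k) matrix U of signatures to suppress.
    """
    U = unwanted_endmembers
    # P = I - U (U^T U)^{-1} U^T annihilates any component lying in span(U).
    P = np.eye(U.shape[0]) - U @ np.linalg.pinv(U.T @ U) @ U.T
    return pixels @ P.T

# Toy check with 3 spectral bands (hypothetical reflectance values):
veg = np.array([[0.1, 0.8, 0.4]]).T        # one "vegetation" signature, (3, 1)
soil = np.array([[0.5, 0.2, 0.3]])         # one soil-like pixel, (1, 3)
mixed = soil + 0.7 * veg.T                 # linearly mixed pixel
residual = osp_project(mixed, veg)         # vegetation component removed
```

Because the projection is linear, `osp_project(mixed, veg)` equals `osp_project(soil, veg)`: the vegetation contribution is removed exactly (for a perfect linear mixture), while the soil component is only rotated into the orthogonal subspace.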
The results show that the method improves classification accuracy by eliminating vegetation and by fusing multiple images. The approach is promising and can be applied to various other classification tasks.
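The fusion step reported above can likewise be sketched as a weighted mean over co-registered images. This is a hedged illustration only: the function name `fuse_images` is hypothetical, and the paper does not specify its weighting scheme, so the sketch simply accepts per-image weights (e.g., proportional to an estimated SNR) and defaults to a plain average, which for n images with equal, independent noise improves SNR by a factor of √n.

```python
import numpy as np

def fuse_images(images, weights=None):
    """Fuse co-registered hyperspectral cubes by a weighted mean.

    images: list of (rows, cols, bands) arrays covering the same area.
    weights: optional per-image weights, e.g., proportional to each
             image's estimated SNR; defaults to a plain average.
    """
    stack = np.stack(images, axis=0)            # (n_images, rows, cols, bands)
    w = np.ones(len(images)) if weights is None else np.asarray(weights, float)
    w = w / w.sum()                             # normalize so the mean is unbiased
    # Contract the image axis against the weights: result is (rows, cols, bands).
    return np.tensordot(w, stack, axes=1)
```

For example, fusing an all-zeros cube with an all-ones cube using weights `[1, 3]` yields a cube of constant value 0.75, while the default equal weighting yields 0.5.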
Journal description:
The Journal of Applied Remote Sensing is a peer-reviewed journal that optimizes the communication of concepts, information, and progress among the remote sensing community.