Rully Soelaiman, Dommy Asfiandy, Yudhi Purwananto, M. Purnomo
Title: Weighted kernel function implementation for hyperspectral image classification based on Support Vector Machine
DOI: 10.1109/ICICI-BME.2009.5417293
Published in: International Conference on Instrumentation, Communication, Information Technology, and Biomedical Engineering 2009
Publication date: 2009-11-01
Citations: 2
Abstract
A hyperspectral image contains many spectral bands, each representing a different range of frequencies. Each spectral band has different characteristics according to the reflection level captured by the hyperspectral remote sensor. These characteristics can distinguish one class from another in hyperspectral image classification; however, in some spectral bands they are not unique, so the classes cannot be separated perfectly. Several factors cause this, e.g. atmospheric effects that disrupt the reflection captured by the sensor and the natural similarity between some classes. Thus, not all spectral bands carry enough information to separate the classes. Because this information is not evenly distributed across the spectral bands, a weighting scheme is needed that gives each spectral band a suitable proportion in the classification. This paper proposes a method that implements such a weighting scheme as embedded feature selection on a hyperspectral dataset provided by the AVIRIS imaging spectrometer. Three methods are used to estimate the spectral weights: Gradient Descent, Mutual Information, and Bhattacharyya Distance. These three methods are integrated as weight estimators into the SVM learning procedure through its kernel function, called the weighted kernel. The experiments in this paper show that Gradient Descent outperforms the other two methods but takes much longer to execute, because it requires many iterations to achieve good performance.
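The weighting idea in the abstract can be sketched concretely. A minimal illustration, not the paper's implementation: a per-band weighted RBF kernel of the form K(x, y) = exp(-γ Σ_b w_b (x_b − y_b)²), together with one of the three weight estimators named above, the per-band Bhattacharyya distance under a Gaussian assumption for a two-class problem. Function names, the normalisation of the weights, and the two-class restriction are all assumptions made for this sketch.

```python
import numpy as np

def weighted_rbf_kernel(X, Y, w, gamma=1.0):
    # K(x, y) = exp(-gamma * sum_b w[b] * (x[b] - y[b])**2):
    # each spectral band b contributes in proportion to its weight w[b].
    Xw = X * np.sqrt(w)  # scaling inputs by sqrt(w) folds the weight
    Yw = Y * np.sqrt(w)  # into the squared distance computed below
    d2 = (np.sum(Xw ** 2, axis=1)[:, None]
          + np.sum(Yw ** 2, axis=1)[None, :]
          - 2.0 * Xw @ Yw.T)
    return np.exp(-gamma * np.maximum(d2, 0.0))  # clamp tiny negatives

def bhattacharyya_weights(X, y):
    # Per-band Bhattacharyya distance between two classes, assuming each
    # band is roughly Gaussian within a class; a larger distance means the
    # band separates the classes better and so receives a larger weight.
    a, b = X[y == 0], X[y == 1]
    m1, m2 = a.mean(axis=0), b.mean(axis=0)
    v1, v2 = a.var(axis=0) + 1e-12, b.var(axis=0) + 1e-12
    v = 0.5 * (v1 + v2)
    d = 0.125 * (m1 - m2) ** 2 / v + 0.5 * np.log(v / np.sqrt(v1 * v2))
    return d / d.sum()  # normalise so the band weights sum to 1
```

The resulting Gram matrix can be fed to any SVM solver that accepts a precomputed kernel; in the gradient-descent variant described in the paper, the weights would instead be updated iteratively inside the SVM training loop, which is what makes that method slower.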