{"title":"基于多目标粒子群算法的最优支持向量机特征选择","authors":"I. Behravan, S. Zahiri, Oveis Dehghantanha","doi":"10.1155/2016/6305043","DOIUrl":null,"url":null,"abstract":"Support vector machine is a classifier, based on the structured risk minimization principle. The performance of the SVM, depends on different parameters such as: penalty factor, C, and the kernel factor, o. Also choosing an appropriate kernel function can improve the Recognition Score and lower the amount of computation. Furthermore, selecting the useful features among several features in dataset not only increases the performance of the SVM, but also reduces the computation time and complexity. So this is an optimization problem which can be solved by a heuristic algorithm. In some cases besides the Recognition Score, the Reliability of the classifier's output, is important. So in such cases a multi-objective optimization algorithm is needed. In this paper we have got the MOPSO algorithm to optimize the parameters of the SVM, choose appropriate kernel function and select the best features simultaneously in order to optimize the Recognition Score and the Reliability of the SVM. Nine different datasets, from UCI machine learning repository, are used to evaluate the power and the effectiveness of the proposed method (MOPSO-SVM). The results of the proposed method are compared to those which are achieved by RBF and MLP neural networks.","PeriodicalId":268101,"journal":{"name":"2016 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2016-03-09","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"21","resultStr":"{\"title\":\"An optimal SVM with feature selection using multi-objective PSO\",\"authors\":\"I. Behravan, S. Zahiri, Oveis Dehghantanha\",\"doi\":\"10.1155/2016/6305043\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Support vector machine is a classifier, based on the structured risk minimization principle. The performance of the SVM, depends on different parameters such as: penalty factor, C, and the kernel factor, o. Also choosing an appropriate kernel function can improve the Recognition Score and lower the amount of computation. Furthermore, selecting the useful features among several features in dataset not only increases the performance of the SVM, but also reduces the computation time and complexity. So this is an optimization problem which can be solved by a heuristic algorithm. In some cases besides the Recognition Score, the Reliability of the classifier's output, is important. So in such cases a multi-objective optimization algorithm is needed. In this paper we have got the MOPSO algorithm to optimize the parameters of the SVM, choose appropriate kernel function and select the best features simultaneously in order to optimize the Recognition Score and the Reliability of the SVM. Nine different datasets, from UCI machine learning repository, are used to evaluate the power and the effectiveness of the proposed method (MOPSO-SVM). 
The results of the proposed method are compared to those which are achieved by RBF and MLP neural networks.\",\"PeriodicalId\":268101,\"journal\":{\"name\":\"2016 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC)\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2016-03-09\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"21\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2016 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1155/2016/6305043\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2016 1st Conference on Swarm Intelligence and Evolutionary Computation (CSIEC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1155/2016/6305043","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
The support vector machine (SVM) is a classifier based on the structural risk minimization principle. The performance of the SVM depends on several parameters, such as the penalty factor C and the kernel parameter σ. Choosing an appropriate kernel function can also improve the recognition score and reduce the amount of computation. Furthermore, selecting the useful features among the features of a dataset not only increases the performance of the SVM but also reduces the computation time and complexity. This is therefore an optimization problem that can be solved by a heuristic algorithm. In some cases, besides the recognition score, the reliability of the classifier's output is also important; such cases call for a multi-objective optimization algorithm. In this paper, we employ the MOPSO (multi-objective particle swarm optimization) algorithm to optimize the parameters of the SVM, choose an appropriate kernel function, and select the best features simultaneously, in order to optimize both the recognition score and the reliability of the SVM. Nine datasets from the UCI machine learning repository are used to evaluate the power and effectiveness of the proposed method (MOPSO-SVM). The results of the proposed method are compared with those achieved by RBF and MLP neural networks.
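To make the optimization setup concrete, the sketch below shows one plausible way a MOPSO particle could encode the configuration described in the abstract: the penalty factor C, the kernel parameter σ (gamma in scikit-learn), the kernel choice, and a binary feature-selection mask, with two objectives returned per particle. This is a minimal illustration, not the authors' implementation; the helper names (decode_particle, evaluate), the value ranges, and in particular the reliability measure (approximated here by the mean predicted-class probability) are assumptions, since the paper may define reliability differently.

```python
# Minimal sketch (assumed encoding, not the paper's code) of a two-objective
# fitness evaluation for a MOPSO-tuned SVM with feature selection.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.model_selection import cross_val_predict
from sklearn.svm import SVC

KERNELS = ["linear", "rbf", "poly", "sigmoid"]  # candidate kernel functions

def decode_particle(position, n_features):
    """Map a real-valued particle position to an SVM configuration."""
    C = 10 ** position[0]                       # position[0] in [-2, 3] -> C in [0.01, 1000]
    gamma = 10 ** position[1]                   # position[1] in [-4, 1] -> gamma in [1e-4, 10]
    kernel = KERNELS[int(position[2]) % len(KERNELS)]
    mask = position[3:3 + n_features] > 0.5     # feature-selection bits
    return C, gamma, kernel, mask

def evaluate(position, X, y):
    """Return the two objectives: recognition score and (assumed) reliability."""
    C, gamma, kernel, mask = decode_particle(position, X.shape[1])
    if not mask.any():                          # at least one feature must be kept
        return 0.0, 0.0
    clf = SVC(C=C, gamma=gamma, kernel=kernel, probability=True)
    proba = cross_val_predict(clf, X[:, mask], y, cv=5, method="predict_proba")
    pred = proba.argmax(axis=1)
    recognition_score = (pred == y).mean()      # cross-validated accuracy
    reliability = proba.max(axis=1).mean()      # mean confidence of the predictions
    return recognition_score, reliability

# Example: evaluate one random particle on a UCI-style dataset.
X, y = load_breast_cancer(return_X_y=True)
rng = np.random.default_rng(0)
particle = np.concatenate(([rng.uniform(-2, 3), rng.uniform(-4, 1), rng.uniform(0, 4)],
                           rng.uniform(0, 1, X.shape[1])))
print(evaluate(particle, X, y))
```

In a full MOPSO run, a swarm of such particles would be updated iteratively while an external archive keeps the non-dominated (recognition score, reliability) solutions, from which a final compromise configuration is chosen.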