Feature selection based on PSO and decision-theoretic rough set model
Aneta Stevanovic, Bing Xue, Mengjie Zhang
2013 IEEE Congress on Evolutionary Computation, 2013-06-20
DOI: 10.1109/CEC.2013.6557914
In this paper, we propose two new methods for feature selection based on particle swarm optimisation and a probabilistic rough set model called the decision-theoretic rough set (DTRS). The first method uses the rule degradation and cost properties of DTRS in the fitness function; it focuses on the quality of the selected feature subset as a whole. The second method extends the first by adding individual feature confidence to the fitness function, which measures the quality of each feature in the subset. Three learning algorithms are employed to evaluate the classification performance of the proposed methods. The experiments are run on six commonly used datasets of varying difficulty. The results show that both methods achieve good feature reduction rates with similar or better classification performance, and both outperform two traditional feature selection methods. The second proposed method outperforms the first in terms of feature reduction rates while maintaining similar or better classification rates.
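The abstract does not give the exact DTRS-based fitness function, but the overall scheme it describes (binary PSO searching over feature subsets, with a fitness that trades subset quality against subset size) can be sketched as follows. This is a minimal illustrative sketch, not the authors' method: the `INFORMATIVE` set and the 0.8/0.2 weighting are hypothetical stand-ins for the paper's DTRS rule-degradation and cost measures.

```python
import math
import random

random.seed(42)

N_FEATURES = 10
# Hypothetical "informative" features for this toy problem; in the paper,
# DTRS properties (rule degradation, decision cost) would play this role.
INFORMATIVE = {0, 3, 7}

def fitness(mask):
    """Illustrative fitness: subset quality plus reduction rate (weights assumed)."""
    selected = {i for i, bit in enumerate(mask) if bit}
    if not selected:
        return 0.0
    quality = len(selected & INFORMATIVE) / len(INFORMATIVE)
    reduction = 1.0 - len(selected) / N_FEATURES
    return 0.8 * quality + 0.2 * reduction

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def binary_pso(n_particles=20, iters=60, w=0.7, c1=1.5, c2=1.5):
    # Each particle is a bit mask over features; velocities stay continuous.
    pos = [[random.choice([0, 1]) for _ in range(N_FEATURES)]
           for _ in range(n_particles)]
    vel = [[0.0] * N_FEATURES for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_fit = [fitness(p) for p in pos]
    g = max(range(n_particles), key=lambda i: pbest_fit[i])
    gbest, gbest_fit = pbest[g][:], pbest_fit[g]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(N_FEATURES):
                r1, r2 = random.random(), random.random()
                vel[i][d] = (w * vel[i][d]
                             + c1 * r1 * (pbest[i][d] - pos[i][d])
                             + c2 * r2 * (gbest[d] - pos[i][d]))
                # Binary PSO: sigmoid of velocity gives the probability of a 1 bit.
                pos[i][d] = 1 if random.random() < sigmoid(vel[i][d]) else 0
            f = fitness(pos[i])
            if f > pbest_fit[i]:
                pbest[i], pbest_fit[i] = pos[i][:], f
                if f > gbest_fit:
                    gbest, gbest_fit = pos[i][:], f
    return gbest, gbest_fit

best_mask, best_fit = binary_pso()
print(best_mask, round(best_fit, 3))
```

In the paper's second method, a per-feature confidence term would be added to `fitness`, rewarding subsets whose individual features are each discriminative rather than only scoring the subset as a whole.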