Title: Feature selection using genetic algorithms for improving accuracy in image classification tasks
Authors: Andrei Dugaesescu, David-Traian Iancu
DOI: 10.1109/ECAI58194.2023.10194193
Published in: 2023 15th International Conference on Electronics, Computers and Artificial Intelligence (ECAI)
Publication date: 2023-06-29
Citations: 0
Abstract
Feature selection can be an effective tool for increasing the robustness and predictive accuracy of classifiers, especially in the presence of noisy features or when dimensionality is high. Genetic algorithms (GAs) lend themselves well to optimizing the search for the best subset of features. This paper presents how a GA can be integrated into the training of neural networks (NNs) as a feature selection step to increase model performance. The reported experiments cover the effect such a technique can have for various sizes of the trained NN, on both harder and easier datasets. Moreover, the experimental setups use feature selection both as a traditional pre-processing step before training the NN, and as an intermediary processing layer between the feature-extractor part of a convolutional neural network (CNN), used in conjunction with more conventional statistical features, and the classification head. Although CNNs are known to inherently model feature selection, meaning that the impact of a GA as a feature selector after the CNN backbone could be limited, the marginal improvements in final performance still offer meaningful insight into how such a classifier manages relevant features.
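The abstract does not give the paper's concrete GA configuration, but the general technique it describes can be sketched as follows: each chromosome is a binary mask over the feature set, fitness is the accuracy of a classifier trained on the masked features, and standard tournament selection, uniform crossover, and bit-flip mutation evolve the population. The sketch below is a minimal, self-contained illustration under assumed hyperparameters (population 30, 40 generations, 5% mutation rate) using a nearest-centroid classifier on synthetic data in place of a neural network, purely to keep the example runnable.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic binary-classification data (assumed setup, not from the paper):
# 6 informative features shifted by the class label plus 14 pure-noise features.
n, n_inform, n_noise = 300, 6, 14
y = rng.integers(0, 2, size=n)
X = np.hstack([
    y[:, None] + rng.normal(0.0, 0.5, size=(n, n_inform)),
    rng.normal(0.0, 1.0, size=(n, n_noise)),
])
n_feat = X.shape[1]
tr, te = slice(0, 200), slice(200, None)  # fixed train/validation split

def fitness(mask):
    """Validation accuracy of a nearest-centroid classifier restricted
    to the features selected by the binary mask (stand-in for the NN)."""
    if mask.sum() == 0:
        return 0.0
    Xs = X[:, mask.astype(bool)]
    c0 = Xs[tr][y[tr] == 0].mean(axis=0)
    c1 = Xs[tr][y[tr] == 1].mean(axis=0)
    d0 = np.linalg.norm(Xs[te] - c0, axis=1)
    d1 = np.linalg.norm(Xs[te] - c1, axis=1)
    pred = (d1 < d0).astype(int)
    return float((pred == y[te]).mean())

# GA loop: elitism + tournament selection, uniform crossover, bit-flip mutation.
pop = rng.integers(0, 2, size=(30, n_feat))
for gen in range(40):
    scores = np.array([fitness(ind) for ind in pop])
    new_pop = [pop[scores.argmax()].copy()]          # keep the best individual
    while len(new_pop) < len(pop):
        i, j = rng.integers(0, len(pop), size=2)     # tournament of size 2
        p1 = pop[i] if scores[i] >= scores[j] else pop[j]
        i, j = rng.integers(0, len(pop), size=2)
        p2 = pop[i] if scores[i] >= scores[j] else pop[j]
        cross = rng.integers(0, 2, size=n_feat)      # uniform crossover
        child = np.where(cross == 1, p1, p2)
        flip = rng.random(n_feat) < 0.05             # per-bit mutation
        child = np.where(flip, 1 - child, child)
        new_pop.append(child)
    pop = np.array(new_pop)

best = pop[np.argmax([fitness(ind) for ind in pop])]
print("selected features:", np.flatnonzero(best))
```

When used as a pre-processing step, the evolved mask simply filters the input columns before NN training; when used as an intermediary layer, the same masking idea would be applied to the concatenated CNN-backbone and statistical features before the classification head.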