{"title":"利用多目标优化构建非随机森林","authors":"Joanna Klikowska, Michał Woźniak","doi":"10.1093/jigpal/jzae110","DOIUrl":null,"url":null,"abstract":"The use of multi-objective optimization to build classifier ensembles is becoming increasingly popular. This approach optimizes more than one criterion simultaneously and returns a set of solutions. Thus the final solution can be more tailored to the user’s needs. The work proposes the MOONF method using one or two criteria depending on the method’s version. Optimization returns solutions as feature subspaces that are then used to train decision tree models. In this way, the ensemble is created non-randomly, unlike the popular Random Subspace approach (such as the Random Forest classifier). Experiments carried out on many imbalanced datasets compare the proposed methods with state-of-the-art methods and show the advantage of the MOONF method in the multi-objective version.","PeriodicalId":0,"journal":{"name":"","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Using Multi-Objective Optimization to build non-Random Forest\",\"authors\":\"Joanna Klikowska, Michał Woźniak\",\"doi\":\"10.1093/jigpal/jzae110\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The use of multi-objective optimization to build classifier ensembles is becoming increasingly popular. This approach optimizes more than one criterion simultaneously and returns a set of solutions. Thus the final solution can be more tailored to the user’s needs. The work proposes the MOONF method using one or two criteria depending on the method’s version. Optimization returns solutions as feature subspaces that are then used to train decision tree models. In this way, the ensemble is created non-randomly, unlike the popular Random Subspace approach (such as the Random Forest classifier). 
Experiments carried out on many imbalanced datasets compare the proposed methods with state-of-the-art methods and show the advantage of the MOONF method in the multi-objective version.\",\"PeriodicalId\":0,\"journal\":{\"name\":\"\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0,\"publicationDate\":\"2024-09-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1093/jigpal/jzae110\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1093/jigpal/jzae110","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Using Multi-Objective Optimization to build non-Random Forest
The use of multi-objective optimization to build classifier ensembles is becoming increasingly popular. This approach optimizes more than one criterion simultaneously and returns a set of solutions, so the final solution can be more closely tailored to the user's needs. This work proposes the MOONF method, which uses one or two criteria depending on the method's version. Optimization returns solutions in the form of feature subspaces, which are then used to train decision tree models. In this way, the ensemble is created non-randomly, unlike the popular Random Subspace approach (used, for example, by the Random Forest classifier). Experiments carried out on many imbalanced datasets compare the proposed methods with state-of-the-art methods and show the advantage of the multi-objective version of MOONF.
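To make the ensemble construction described above concrete, the following is a minimal sketch (using scikit-learn, and not the authors' MOONF implementation) of training decision trees on fixed feature subspaces and combining them by majority vote. In MOONF the subspaces are the output of multi-objective optimization; here they are hand-picked purely to illustrate the ensemble structure that such subspaces feed into.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.tree import DecisionTreeClassifier

# Synthetic binary classification data stands in for a real dataset.
X, y = make_classification(n_samples=200, n_features=10, random_state=0)

# Hypothetical feature subspaces. In MOONF these would be returned by
# the multi-objective optimizer rather than chosen by hand.
subspaces = [[0, 1, 2], [3, 4, 5], [6, 7, 8, 9]]

# One decision tree per subspace, each trained only on its own features.
ensemble = [
    (features, DecisionTreeClassifier(random_state=0).fit(X[:, features], y))
    for features in subspaces
]

def predict(ensemble, X_new):
    """Majority vote over the per-subspace trees."""
    votes = np.array([tree.predict(X_new[:, features])
                      for features, tree in ensemble])
    return (votes.mean(axis=0) >= 0.5).astype(int)

preds = predict(ensemble, X)
```

The contrast with Random Forest is in how `subspaces` is produced: Random Subspace methods draw the feature subsets at random, whereas the approach described here selects them deliberately to optimize the chosen criteria.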