{"title":"Simulated annealing based classification","authors":"S. Finnerty, S. Sen","doi":"10.1109/TAI.1994.346392","DOIUrl":null,"url":null,"abstract":"Attribute based classification has been one of the most active areas of machine learning research over the past decade. We view the problem of hypotheses formation for classification as a search problem. Whereas previous research acquiring classification knowledge have used a deterministic bias for forming generalizations, we use a more random bias for taking inductive leaps. We re-formulate the supervised classification problem as a function optimization problem, the goal of which is to search for a hypotheses that minimizes the number of incorrect classifications of training instances. We use a simulated annealing based classifier (SAC) to optimize the hypotheses used for classification. The particular variation of simulated annealing algorithm that we have used is known as Very Fast Simulated Re-annealing (VFSR). We use a batch-incremental mode of learning to compare SAC with a genetic algorithm based classifier, GABIL, and a traditional incremental machine learning algorithm, ID5R. By using a set of artificial target concepts, we show that SAC performs better on more complex target concepts.<<ETX>>","PeriodicalId":262014,"journal":{"name":"Proceedings Sixth International Conference on Tools with Artificial Intelligence. TAI 94","volume":"69 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Sixth International Conference on Tools with Artificial Intelligence. TAI 94","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/TAI.1994.346392","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 4
Abstract
Attribute-based classification has been one of the most active areas of machine learning research over the past decade. We view the problem of hypothesis formation for classification as a search problem. Whereas previous research on acquiring classification knowledge has used a deterministic bias for forming generalizations, we use a more random bias for taking inductive leaps. We re-formulate the supervised classification problem as a function optimization problem, the goal of which is to search for a hypothesis that minimizes the number of incorrect classifications of training instances. We use a simulated annealing based classifier (SAC) to optimize the hypotheses used for classification. The particular variation of the simulated annealing algorithm that we have used is known as Very Fast Simulated Re-annealing (VFSR). We use a batch-incremental mode of learning to compare SAC with a genetic algorithm based classifier, GABIL, and a traditional incremental machine learning algorithm, ID5R. Using a set of artificial target concepts, we show that SAC performs better on more complex target concepts.
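To illustrate the reformulation of classification as function optimization, the following is a minimal sketch, not the authors' SAC/VFSR implementation. It assumes a hypothesis is a single conjunctive rule over symbolic attributes, uses training misclassifications as the energy to minimize, and applies a plain geometric cooling schedule rather than VFSR's re-annealing schedule; all function and variable names are hypothetical.

```python
import math
import random


def misclassifications(hypothesis, instances):
    """Energy function: count training instances the rule classifies incorrectly.

    `hypothesis` maps attribute name -> required value (None means don't-care);
    an instance matching every non-None constraint is predicted positive.
    """
    errors = 0
    for attributes, label in instances:
        predicted_positive = all(
            value is None or attributes[attr] == value
            for attr, value in hypothesis.items()
        )
        if predicted_positive != label:
            errors += 1
    return errors


def perturb(hypothesis, domains):
    """Randomly mutate one attribute constraint (the 'random inductive leap')."""
    new_hyp = dict(hypothesis)
    attr = random.choice(list(domains))
    new_hyp[attr] = random.choice(domains[attr] + [None])
    return new_hyp


def anneal_classifier(instances, domains, temp=5.0, cooling=0.995, steps=20000):
    """Search the rule space by simulated annealing, minimizing training error."""
    current = {attr: None for attr in domains}  # start with the most general rule
    current_err = misclassifications(current, instances)
    best, best_err = current, current_err
    for _ in range(steps):
        candidate = perturb(current, domains)
        cand_err = misclassifications(candidate, instances)
        delta = cand_err - current_err
        # Metropolis criterion: always accept improvements,
        # occasionally accept worse moves while the temperature is high.
        if delta <= 0 or random.random() < math.exp(-delta / temp):
            current, current_err = candidate, cand_err
            if current_err < best_err:
                best, best_err = current, current_err
        temp *= cooling  # simple geometric cooling (VFSR uses a faster schedule)
    return best, best_err


# Toy usage: learn "color = red" from labelled instances.
domains = {"color": ["red", "blue"], "size": ["small", "large"]}
instances = [({"color": "red", "size": "small"}, True),
             ({"color": "red", "size": "large"}, True),
             ({"color": "blue", "size": "small"}, False)]
rule, err = anneal_classifier(instances, domains)
```

The key difference from a deterministic generalization bias is the acceptance rule: worse hypotheses can be accepted with probability exp(-delta / temp), which lets the search escape local minima before the temperature drops.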