Cascaded GP models for data mining
P. Lichodzijewski, M. Heywood, A. N. Zincir-Heywood
Proceedings of the 2004 Congress on Evolutionary Computation (IEEE Cat. No.04TH8753), 2004-06-19
DOI: 10.1109/CEC.2004.1331178

Abstract: The cascade architecture for incremental learning is demonstrated within the context of genetic programming. Such a scheme provides the basis for building progressively more complex models until a desired degree of accuracy is reached. The architecture is demonstrated on several data mining datasets. Efficient training on standard computing platforms is retained by using the RSS-DSS algorithm to stochastically sample datasets in proportion to exemplar 'difficulty' and 'age'. Finally, the ensuing empirical study provides the basis for recommending sum-square cost functions for the datasets considered.
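The abstract's central efficiency idea is sampling training exemplars with probability proportional to their 'difficulty' and 'age' rather than iterating over the full dataset. As a rough illustration of that weighting scheme only (the field names, the additive weight, and the age-reset policy below are assumptions for the sketch, not details taken from the paper), the selection step might look like:

```python
import random

def sample_block(exemplars, block_size, rng=random):
    """Draw a training subset with probability proportional to each
    exemplar's 'difficulty' plus 'age' (an RSS-DSS-style weighting).
    The weight formula and bookkeeping here are illustrative only."""
    weights = [e["difficulty"] + e["age"] for e in exemplars]
    chosen = rng.choices(exemplars, weights=weights, k=block_size)
    # Exemplars left out of the block grow 'older', raising their
    # chance of selection later; chosen ones have their age reset.
    for e in exemplars:
        e["age"] += 1
    for e in chosen:
        e["age"] = 0
    return chosen

# Hypothetical usage: ten exemplars, harder ones weighted more heavily.
pool = [{"difficulty": float(i + 1), "age": 0} for i in range(10)]
block = sample_block(pool, 3, rng=random.Random(0))
```

Under such a scheme, each GP fitness evaluation touches only `block_size` exemplars, so per-generation cost is decoupled from the size of the full dataset.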