Sparse grid classifiers as base learners for AdaBoost
A. Heinecke, B. Peherstorfer, D. Pflüger, Zhongwen Song
2012 International Conference on High Performance Computing & Simulation (HPCS), July 2, 2012. DOI: 10.1109/HPCSim.2012.6266906
Abstract: We consider a classification method based on sparse grids which scales only linearly in the number of data points and is thus well-suited for huge amounts of data. In order to obtain competitive results, such sparse grid classifiers are usually enhanced by locally refining the underlying regular sparse grid. However, in order to parallelize the corresponding adaptive algorithms, a thorough knowledge of the hardware is necessary. Instead of improving performance by refining the sparse grid, we construct a team of classifiers relying just on regular sparse grids and employ them as base learners within AdaBoost. Our examples with synthetic and real-world datasets show that we can achieve similar or better results than with locally refined sparse grids or libSVM, with respect to both runtime and accuracy.
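
To make the boosting idea concrete, the following is a minimal sketch of the AdaBoost loop into which a base learner such as a regular sparse grid classifier would be plugged. The sparse grid learner itself is not shown here; a depth-limited decision tree stands in as a placeholder so the sketch runs, and the function names (adaboost_fit, adaboost_predict) are illustrative assumptions, not the authors' code.

```python
# Sketch of AdaBoost with a pluggable base learner (assumption: a decision
# tree stands in for the paper's regular sparse grid classifier).
import numpy as np
from sklearn.tree import DecisionTreeClassifier


def adaboost_fit(X, y, n_rounds=10):
    """Train a boosted team of weak classifiers; labels y must be in {-1, +1}."""
    n = X.shape[0]
    w = np.full(n, 1.0 / n)              # start with uniform sample weights
    learners, alphas = [], []
    for _ in range(n_rounds):
        # Fit the base learner on the weighted training set.
        clf = DecisionTreeClassifier(max_depth=2)
        clf.fit(X, y, sample_weight=w)
        pred = clf.predict(X)
        err = np.sum(w * (pred != y)) / np.sum(w)
        if err >= 0.5:                    # no better than chance: stop boosting
            break
        alpha = 0.5 * np.log((1.0 - err) / max(err, 1e-12))
        # Reweight: misclassified points gain weight, correct ones lose weight.
        w *= np.exp(-alpha * y * pred)
        w /= w.sum()
        learners.append(clf)
        alphas.append(alpha)
    return learners, alphas


def adaboost_predict(learners, alphas, X):
    """Weighted-majority vote of the boosted team."""
    agg = sum(a * clf.predict(X) for clf, a in zip(learners, alphas))
    return np.sign(agg)
```

In the paper's setting, each round would instead train a classifier on a regular sparse grid with the current sample weights, which keeps every base learner's cost linear in the number of data points and avoids the hardware-specific tuning that adaptive grid refinement would require for parallelization.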