{"title":"具有平稳混合输入序列的稀疏正则化支持向量机","authors":"Yi Ding, Yi Tang","doi":"10.1109/ICWAPR.2010.5576330","DOIUrl":null,"url":null,"abstract":"It has been shown that a sparse target can be well learned by the l1-regularized learning methods when samples are independent and identically distributed (i.i.d.). In this paper we go far beyond this classical framework by bounding the generalization errors and excess risks of l1-regularized support vector machine(l1-svm) for stationary β-mixing observations. Utilizing a technique introduced by [1] that constructs a sequence of independent blocks close in distribution to the original samples, such bounds are developed by Rademacher average technique. The results replied partly an open question in [2] of wether Rademacher average technique can be extended to deal with dependent status.","PeriodicalId":219884,"journal":{"name":"2010 International Conference on Wavelet Analysis and Pattern Recognition","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2010-07-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Sparsity-regularized support vector machine with stationary mixing input sequence\",\"authors\":\"Yi Ding, Yi Tang\",\"doi\":\"10.1109/ICWAPR.2010.5576330\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"It has been shown that a sparse target can be well learned by the l1-regularized learning methods when samples are independent and identically distributed (i.i.d.). In this paper we go far beyond this classical framework by bounding the generalization errors and excess risks of l1-regularized support vector machine(l1-svm) for stationary β-mixing observations. Utilizing a technique introduced by [1] that constructs a sequence of independent blocks close in distribution to the original samples, such bounds are developed by Rademacher average technique. The results replied partly an open question in [2] of wether Rademacher average technique can be extended to deal with dependent status.\",\"PeriodicalId\":219884,\"journal\":{\"name\":\"2010 International Conference on Wavelet Analysis and Pattern Recognition\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2010-07-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2010 International Conference on Wavelet Analysis and Pattern Recognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICWAPR.2010.5576330\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2010 International Conference on Wavelet Analysis and Pattern Recognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICWAPR.2010.5576330","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Sparsity-regularized support vector machine with stationary mixing input sequence
It has been shown that a sparse target can be well learned by l1-regularized learning methods when the samples are independent and identically distributed (i.i.d.). In this paper we go beyond this classical framework by bounding the generalization error and excess risk of the l1-regularized support vector machine (l1-SVM) for stationary β-mixing observations. Using the technique introduced in [1], which constructs a sequence of independent blocks close in distribution to the original samples, we derive these bounds via the Rademacher average technique. The results partially answer an open question raised in [2]: whether the Rademacher average technique can be extended to handle dependent observations.
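To make the two ingredients concrete, below is a minimal sketch (not the authors' code) of (1) the independent-block construction of [1], which keeps every other block of a dependent sequence so that the retained blocks are close in distribution to independent ones, and (2) an l1-penalized linear SVM fitted on those blocks. The block length `a`, the toy AR(1) data generator, and the sparse target `w_star` are illustrative assumptions, not details from the paper.

```python
# Sketch: blocking a beta-mixing sequence, then fitting an l1-penalized SVM.
import numpy as np
from sklearn.svm import LinearSVC

def odd_blocks(X, y, a):
    """Split the sequence into consecutive blocks of length a and keep
    every other block. For a stationary beta-mixing source the kept blocks
    are close in distribution to independent blocks, so i.i.d. tools such
    as Rademacher averages can be applied to them."""
    n = len(y) - len(y) % (2 * a)          # drop the ragged tail
    keep = np.zeros(n, dtype=bool)
    for start in range(0, n, 2 * a):       # blocks 0, 2, 4, ... of length a
        keep[start:start + a] = True
    return X[:n][keep], y[:n][keep]

rng = np.random.default_rng(0)

# Toy stationary (hence beta-mixing) AR(1) inputs, labels from a sparse target.
n, d = 2000, 20
w_star = np.zeros(d); w_star[:3] = [1.0, -1.0, 0.5]   # sparse target
X = np.zeros((n, d))
for t in range(1, n):
    X[t] = 0.5 * X[t - 1] + rng.normal(size=d)        # dependent samples
y = np.sign(X @ w_star + 0.1 * rng.normal(size=n))

Xb, yb = odd_blocks(X, y, a=20)   # nearly independent sub-sample

# l1-penalized linear SVM; scikit-learn pairs the l1 penalty with the
# squared hinge loss and requires the primal (dual=False) solver.
clf = LinearSVC(penalty="l1", loss="squared_hinge", dual=False, C=0.1)
clf.fit(Xb, yb)
print("nonzero coefficients:", np.flatnonzero(np.abs(clf.coef_[0]) > 1e-6))
```

The paper's bounds then control, via the β-mixing coefficients, the gap between the blocked sub-sample and the original dependent sequence, so the i.i.d.-style Rademacher analysis carried out on the blocks transfers back to the original observations.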