{"title":"通过样本宽度量化学习的准确性","authors":"M. Anthony, Joel Ratsaby","doi":"10.1109/FOCI.2013.6602459","DOIUrl":null,"url":null,"abstract":"In a recent paper, the authors introduced the notion of sample width for binary classifiers defined on the set of real numbers. It was shown that the performance of such classifiers could be quantified in terms of this sample width. This paper considers how to adapt the idea of sample width so that it can be applied in cases where the classifiers are defined on some finite metric space. We discuss how to employ a greedy set-covering heuristic to bound generalization error. Then, by relating the learning problem to one involving certain graph-theoretic parameters, we obtain generalization error bounds that depend on the sample width and on measures of `density' of the underlying metric space.","PeriodicalId":237129,"journal":{"name":"2013 IEEE Symposium on Foundations of Computational Intelligence (FOCI)","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2013-04-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Quantifying accuracy of learning via sample width\",\"authors\":\"M. Anthony, Joel Ratsaby\",\"doi\":\"10.1109/FOCI.2013.6602459\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"In a recent paper, the authors introduced the notion of sample width for binary classifiers defined on the set of real numbers. It was shown that the performance of such classifiers could be quantified in terms of this sample width. This paper considers how to adapt the idea of sample width so that it can be applied in cases where the classifiers are defined on some finite metric space. We discuss how to employ a greedy set-covering heuristic to bound generalization error. Then, by relating the learning problem to one involving certain graph-theoretic parameters, we obtain generalization error bounds that depend on the sample width and on measures of `density' of the underlying metric space.\",\"PeriodicalId\":237129,\"journal\":{\"name\":\"2013 IEEE Symposium on Foundations of Computational Intelligence (FOCI)\",\"volume\":\"2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2013-04-16\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2013 IEEE Symposium on Foundations of Computational Intelligence (FOCI)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FOCI.2013.6602459\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2013 IEEE Symposium on Foundations of Computational Intelligence (FOCI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FOCI.2013.6602459","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
In a recent paper, the authors introduced the notion of sample width for binary classifiers defined on the set of real numbers. It was shown that the performance of such classifiers could be quantified in terms of this sample width. This paper considers how to adapt the idea of sample width so that it can be applied in cases where the classifiers are defined on some finite metric space. We discuss how to employ a greedy set-covering heuristic to bound generalization error. Then, by relating the learning problem to one involving certain graph-theoretic parameters, we obtain generalization error bounds that depend on the sample width and on measures of 'density' of the underlying metric space.
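The covering step the abstract mentions is the classical greedy set-cover heuristic. As a rough illustration only, here is a minimal Python sketch of that heuristic applied to covering a finite metric space by balls around candidate centres; the names (greedy_set_cover, balls, the radius-1 ball construction) are our own illustrative assumptions, not the paper's actual construction.

```python
# Minimal sketch of the classical greedy set-cover heuristic.
# Assumption (ours, not the paper's): the finite metric space is covered
# by balls of a fixed radius around candidate centres; the paper's actual
# construction may differ, and all identifiers here are illustrative.

def greedy_set_cover(universe, subsets):
    """Greedily choose labels from `subsets` until `universe` is covered.

    universe -- set of points to cover
    subsets  -- dict mapping a label to the set of points it covers
    Returns a list of chosen labels; the classical guarantee is that the
    cover is within a ~ln|universe| factor of the optimal cover size.
    """
    uncovered = set(universe)
    cover = []
    while uncovered:
        # Pick the subset covering the most still-uncovered points.
        best = max(subsets, key=lambda s: len(subsets[s] & uncovered))
        if not subsets[best] & uncovered:
            raise ValueError("the given subsets do not cover the universe")
        cover.append(best)
        uncovered -= subsets[best]
    return cover

# Tiny example: points 0..9 on the integer line, covered by radius-1
# balls centred at each point.
points = set(range(10))
balls = {c: {x for x in points if abs(x - c) <= 1} for c in points}
print(greedy_set_cover(points, balls))  # -> [1, 4, 7, 8]
```

The logarithmic approximation guarantee of the greedy heuristic is what makes it useful for bounding covering-type quantities, plausibly the kind of bound on generalization error the paper exploits.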