Optimality Analysis of Boundary-Uncertainty-Based Classifier Model Parameter Status Selection Method
David R. Ha, Hideyuki Watanabe, Yuya Tomotoshi, Emilie Delattre, S. Katagiri
International Conference on Signal Processing and Machine Learning, 2018
DOI: 10.1145/3297067.3297076
Citations: 2
Abstract
We propose a novel method that selects an optimal classifier model parameter status by evaluating an uncertainty measure over the estimated class boundaries instead of estimating the classification error probability. A key feature of our method is that it can select a classifier parameter status without a separate validation sample set and can be easily applied to any reasonable type of classifier model, unlike traditional approaches, which often require a validation sample set or are otherwise less practical. In this paper, we first summarize our method and its experimental evaluation results and introduce a mathematical formalization of the posterior probability estimation procedure it adopts. We then show the convergence property of this estimation procedure and finally demonstrate our method's optimality in the practical situation where only a finite number of training samples is available.
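To make the selection idea concrete, the following is a minimal, hypothetical sketch, not the authors' exact procedure: a k-nearest-neighbor class-frequency estimate stands in for the paper's posterior probability estimation, the candidate "parameter statuses" are assumed to be a grid of SVM regularization values, and each candidate is scored by how close the estimated posteriors of training samples near its decision boundary are to 0.5 (maximal uncertainty, as expected on the Bayes boundary). No separate validation set is used.

```python
# Hypothetical illustration of boundary-uncertainty-based parameter status
# selection (simplified; not the authors' exact algorithm).
import numpy as np
from sklearn.datasets import make_classification
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

X, y = make_classification(n_samples=600, n_features=8, random_state=0)

# Independent posterior estimate on the training data (k-NN class frequency).
posterior = KNeighborsClassifier(n_neighbors=25).fit(X, y).predict_proba(X)[:, 1]

def boundary_deviation(model, X, posterior, band=0.3):
    """Mean |P(class 1 | x) - 0.5| over training samples near the model's boundary."""
    margin = np.abs(model.decision_function(X))
    near = margin < band * margin.std()      # samples close to the estimated boundary
    if not near.any():
        return np.inf                        # no boundary support: reject this status
    return float(np.mean(np.abs(posterior[near] - 0.5)))

# Candidate "parameter statuses": here, a grid of regularization strengths.
candidates = [0.01, 0.1, 1.0, 10.0, 100.0]
scores = {C: boundary_deviation(SVC(C=C).fit(X, y), X, posterior)
          for C in candidates}

best_C = min(scores, key=scores.get)         # boundary closest to maximal uncertainty
print(scores)
print("selected C:", best_C)
```

The selection rule here (smallest deviation of near-boundary posteriors from 0.5) is one plausible way to turn boundary uncertainty into a scalar criterion; the paper's own formalization and convergence analysis are given in the full text.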