{"title":"T. Anderson判别函数的逼近与类后验概率的估计。逼近法的收敛性","authors":"V. Zenkov","doi":"10.1109/MLSD.2018.8551879","DOIUrl":null,"url":null,"abstract":"Discriminant function in T. Anderson’s definition is a function of regression in feature space. The training set in supervised learning is converted into a set of regression analysis by replacing class numbers with the differences of the corresponding costs of classification errors. The posterior probabilities of classes at points on the boundary between them depend only on the costs of classification errors. This is the basis for the method of obtaining estimates of a posterior probability of classes. It does not require adaptations to discriminant functions such as, for example, the Platt’s calibrator. For the heuristic method of approximation of the discriminant function in the range of zero values, the convergence conditions of the algorithm are obtained with increasing the volume of the training set and the length of the iterative process.","PeriodicalId":158352,"journal":{"name":"2018 Eleventh International Conference \"Management of large-scale system development\" (MLSD","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2018-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Approximation of the T. Anderson’s Discriminant Function and Estimation of the Posterior Probabilities of Classes. Convergence of the Approximation Method\",\"authors\":\"V. Zenkov\",\"doi\":\"10.1109/MLSD.2018.8551879\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Discriminant function in T. Anderson’s definition is a function of regression in feature space. The training set in supervised learning is converted into a set of regression analysis by replacing class numbers with the differences of the corresponding costs of classification errors. The posterior probabilities of classes at points on the boundary between them depend only on the costs of classification errors. This is the basis for the method of obtaining estimates of a posterior probability of classes. It does not require adaptations to discriminant functions such as, for example, the Platt’s calibrator. 
For the heuristic method of approximation of the discriminant function in the range of zero values, the convergence conditions of the algorithm are obtained with increasing the volume of the training set and the length of the iterative process.\",\"PeriodicalId\":158352,\"journal\":{\"name\":\"2018 Eleventh International Conference \\\"Management of large-scale system development\\\" (MLSD\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2018-10-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2018 Eleventh International Conference \\\"Management of large-scale system development\\\" (MLSD\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/MLSD.2018.8551879\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2018 Eleventh International Conference \"Management of large-scale system development\" (MLSD","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/MLSD.2018.8551879","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract: The discriminant function in T. Anderson’s definition is a regression function in feature space. The training set of a supervised learning problem is converted into a regression data set by replacing the class labels with the differences of the corresponding costs of classification errors. The posterior probabilities of the classes at points on the boundary between them depend only on these misclassification costs. This property is the basis of the method for estimating the posterior probabilities of the classes; unlike, for example, Platt’s calibrator, it requires no additional adaptation of the discriminant function. For the heuristic method that approximates the discriminant function in the neighborhood of its zero values, convergence conditions of the algorithm are obtained as the size of the training set and the length of the iterative process increase.
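The label-to-cost conversion and the boundary posterior mentioned in the abstract can be illustrated with a minimal sketch. Everything below is an assumption made for illustration (the synthetic data, the cost values c0 and c1, the quadratic feature map, and the plain least-squares fit); it is not the paper's algorithm, only the general idea of regressing signed error costs and reading the boundary posterior off the costs.

# Minimal sketch, assuming synthetic 2-D Gaussian classes and illustrative costs.
import numpy as np

rng = np.random.default_rng(0)

# Two classes in a 2-D feature space (hypothetical example).
n = 500
X0 = rng.normal(loc=[-1.0, 0.0], scale=1.0, size=(n, 2))   # class 0
X1 = rng.normal(loc=[+1.0, 0.0], scale=1.0, size=(n, 2))   # class 1
X = np.vstack([X0, X1])
y_class = np.hstack([np.zeros(n, dtype=int), np.ones(n, dtype=int)])

# Assumed misclassification costs: c0 = cost of an error on a class-0 object,
# c1 = cost of an error on a class-1 object.
c0, c1 = 1.0, 3.0

# Replace class labels with signed costs, so the regression target satisfies
# E[t | x] = c1*P(1|x) - c0*P(0|x), which vanishes exactly on the
# cost-weighted decision boundary.
t = np.where(y_class == 1, c1, -c0)

# Approximate the discriminant function by least-squares regression on
# quadratic polynomial features (an arbitrary choice for this sketch).
def features(X):
    x1, x2 = X[:, 0], X[:, 1]
    return np.column_stack([np.ones(len(X)), x1, x2, x1 * x2, x1**2, x2**2])

w, *_ = np.linalg.lstsq(features(X), t, rcond=None)

def discriminant(X):
    return features(X) @ w

# Decision rule: assign class 1 where the fitted regression is positive.
y_pred = (discriminant(X) > 0).astype(int)
print("training accuracy:", np.mean(y_pred == y_class))

# On the boundary (discriminant = 0) the standard cost-weighted relation
# c1*P(1|x) = c0*P(0|x) with P(0|x) + P(1|x) = 1 gives
# P(1|x) = c0 / (c0 + c1): the posterior there is fixed by the costs alone,
# as the abstract states.
print("posterior of class 1 on the boundary:", c0 / (c0 + c1))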