Constrained linear discriminant rule via the Studentized classification statistic based on monotone missing data

Nobumichi Shutoh, Masashi Hyodo, T. Pavlenko, T. Seo

SUT Journal of Mathematics, Vol. 18, No. 1 (2012). DOI: 10.55937/sut/1345734342

Abstract
This paper provides an asymptotic expansion for the distribution of the Studentized linear discriminant function with k-step monotone missing training data. The result is a generalization of those derived by Anderson [1] and Shutoh and Seo [12]. Furthermore, we derive the cutoff point constrained by a conditional probability of misclassification, using the idea of McLachlan [8]. Finally, we perform a Monte Carlo simulation to evaluate our results.
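To make the classification setting concrete, below is a minimal Python sketch of the classical complete-data linear discriminant statistic (Anderson's W) with a fixed cutoff c, whose misclassification probabilities are estimated by Monte Carlo simulation. It does not implement the paper's contributions: the k-step monotone-missing-data estimators, the asymptotic expansion, and the McLachlan-type constrained cutoff are omitted, and all parameters (dimension, sample sizes, means, covariance, cutoff) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative (assumed) setup: two p-variate normal populations with a
# common covariance matrix and *complete* training data.  The paper's
# monotone-missing-data estimators and constrained cutoff are not
# reproduced here.
p = 3
n1, n2 = 30, 30
mu1 = np.zeros(p)
mu2 = np.full(p, 1.0)
Sigma = np.eye(p)

X1 = rng.multivariate_normal(mu1, Sigma, size=n1)
X2 = rng.multivariate_normal(mu2, Sigma, size=n2)

# Sample means and the pooled (unbiased) covariance estimate.
xbar1, xbar2 = X1.mean(axis=0), X2.mean(axis=0)
S = ((n1 - 1) * np.cov(X1, rowvar=False) +
     (n2 - 1) * np.cov(X2, rowvar=False)) / (n1 + n2 - 2)
S_inv = np.linalg.inv(S)

def W(X):
    """Anderson-type linear discriminant statistic, evaluated row-wise:
    W(x) = (xbar1 - xbar2)' S^{-1} (x - (xbar1 + xbar2) / 2)."""
    return (X - 0.5 * (xbar1 + xbar2)) @ S_inv @ (xbar1 - xbar2)

# Monte Carlo estimate of the two conditional misclassification
# probabilities for a fixed cutoff c (assign to population 1 when W >= c).
c = 0.0
m = 100_000
new1 = rng.multivariate_normal(mu1, Sigma, size=m)
new2 = rng.multivariate_normal(mu2, Sigma, size=m)
err1 = np.mean(W(new1) < c)   # P(classified as 2 | drawn from population 1)
err2 = np.mean(W(new2) >= c)  # P(classified as 1 | drawn from population 2)
print(f"Estimated misclassification probabilities: {err1:.4f}, {err2:.4f}")
```

Roughly speaking, in the paper's setting the sample means and pooled covariance above would be replaced by estimators built from the k-step monotone missing training data, and the cutoff c would be chosen, following McLachlan's idea, so that a conditional probability of misclassification satisfies a prescribed constraint.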