Title: Jackknife Estimator Consistency for Nonlinear Mixture
Authors: R. Maiboroda, Vitaliy Miroshnychenko
DOI: 10.11159/icsta22.149
Published in: Proceedings of the 4th International Conference on Statistics: Theory and Applications, 2022-08-01
Citations: 0
Abstract
Extended Abstract. This paper continues our studies of the jackknife (JK) technique for estimating the covariance matrices of estimators in models of mixture with varying concentrations (MVC) [2, 3]; for JK applications to homogeneous samples, see [1]. In MVC models one deals with a non-homogeneous sample consisting of subjects that belong to M different sub-populations (mixture components). The probabilities with which a subject belongs to each mixture component are known, and they differ from subject to subject; therefore the observations are independent but not identically distributed. Each object of the sample Ξ_n = (ξ_j)_{j=1}^n belongs to one of the M mixture components and has observed characteristics ξ_j = (X_j, Y_j) ∈ ℝ^D together with a hidden component label κ_j, where κ_j = m if the j-th object belongs to the m-th component. The labels κ_j are unknown, but the mixing probabilities p_{j;n,m} = P{κ_j = m} are known. Here X_j is a vector of regressors and Y_j is the response in the regression model

Y_j = g(X_j; b^{(κ_j)}) + ε_j,

where b^{(m)} ∈ Θ ⊆ ℝ^d is the vector of unknown regression parameters of the m-th component, g: ℝ^{D−1} × Θ → ℝ is a known regression function, and ε_j is a regression error term. The random variables X_j and ε_j are independent, and their distributions differ across the mixture components.
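To make the jackknife covariance-estimation idea concrete, here is a minimal, self-contained Python sketch. It simulates a two-component MVC sample with known, subject-dependent mixing probabilities, fits the first component's parameters by weighted least squares, and computes the delete-one jackknife estimate of the estimator's covariance matrix. The regression function g, the naive concentration-based weights (a stand-in for the refined MVC weighting the paper relies on), and all names are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 400

# Mixing probabilities p_{j;m}, m = 1, 2: known, and different for each subject j.
t = np.linspace(0.1, 0.9, n)
P = np.column_stack([t, 1.0 - t])

# Hidden component labels kappa_j (0-based here) and true parameters b^{(m)}.
kappa = (rng.random(n) > P[:, 0]).astype(int)
b_true = np.array([[1.0, 2.0],     # b^{(1)}
                   [-1.0, 0.5]])   # b^{(2)}

# Regression model Y_j = g(X_j; b^{(kappa_j)}) + eps_j with a g nonlinear in x.
X = rng.uniform(-1.0, 1.0, n)

def g(x, b0, b1):
    return b0 + b1 * x**2

Y = g(X, b_true[kappa, 0], b_true[kappa, 1]) + 0.1 * rng.standard_normal(n)

def wls_component1(X, Y, a):
    """Weighted least squares for the first component's parameter vector."""
    Z = np.column_stack([np.ones_like(X), X**2])   # design for g(x; b) = b0 + b1*x^2
    w = np.sqrt(a)
    b, *_ = np.linalg.lstsq(Z * w[:, None], Y * w, rcond=None)
    return b

a = P[:, 0]                         # naive concentration weights (illustrative only)
b_hat = wls_component1(X, Y, a)

# Delete-one jackknife: re-estimate on every leave-one-out subsample, then
# Sigma_hat = (n-1)/n * sum_i (b_(i) - b_bar)(b_(i) - b_bar)^T.
reps = np.array([wls_component1(np.delete(X, i), np.delete(Y, i), np.delete(a, i))
                 for i in range(n)])
b_bar = reps.mean(axis=0)
Sigma = (n - 1) / n * (reps - b_bar).T @ (reps - b_bar)

print("point estimate:", b_hat)
print("JK covariance estimate:\n", Sigma)
```

The covariance formula in the last step is the standard delete-one jackknife estimator; only the weighting scheme differs between this toy setup and the MVC estimators the paper studies.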