Jackknife Estimator Consistency for Nonlinear Mixture

R. Maiboroda, Vitaliy Miroshnychenko
{"title":"Jackknife Estimator Consistency for Nonlinear Mixture","authors":"R. Maiboroda, Vitaliy MIroshnychenko","doi":"10.11159/icsta22.149","DOIUrl":null,"url":null,"abstract":"Extended Abstract This paper continues our studies of the jackknife (JK) technique application for estimation of estimators’ covariance matrices in models of mixture with varying concentrations (MVC) [2, 3]. On JK applications for homogeneous samples, see [1]. In MVC models one deals with a non-homogeneous sample, which consists of subjects belonging to 𝑀 different sub-populations (mixture components). One knows the probabilities with which a subject belongs to the mixture components and these probabilities are different for different subjects. Therefore, the considered observations are independent but not identically distributed. We consider objects from a mixture with various concentrations. All objects from the sample Ξ 𝑛 belongs to one of M different mixture components. Each object from the sample 𝛯 𝑛 = (𝜉 𝑗 ) 𝑗=1 𝑛 has observed characteristics 𝜉 𝑗 = (𝑋 𝑗 , 𝑌 𝑗 ) ∈ ℝ 𝐷 and one hidden 𝜅 𝑗 . 𝜅 𝑗 = 𝑚 if 𝑗 -th objects belongs to the 𝑚 -th component. These numbers are unknown, but we know the mixing probabilities 𝑝 𝑗;𝑛𝑚 = 𝑃{𝜅 𝑗 = 𝑚} . The 𝑋 𝑗 is a vector of regressors and 𝑌 𝑗 is a response in the regression model Here 𝑏 (𝑚) ∈ Θ ⊆ ℝ 𝑑 is a vector of unknown regression parameters for the 𝑚 -th component, the 𝑔: ℝ 𝐷−1 × Θ → ℝ is a known regression function, 𝜀 𝑗 is a regression error term. Random variables 𝑋 𝑗 and 𝜀 𝑗 are independent and their distribution is different","PeriodicalId":325859,"journal":{"name":"Proceedings of the 4th International Conference on Statistics: Theory and Applications","volume":"23 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the 4th International Conference on Statistics: Theory and Applications","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.11159/icsta22.149","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Extended Abstract

This paper continues our studies of the jackknife (JK) technique for estimating the covariance matrices of estimators in models of mixture with varying concentrations (MVC) [2, 3]. For JK applications to homogeneous samples, see [1]. In MVC models one deals with a non-homogeneous sample consisting of subjects that belong to M different sub-populations (mixture components). The probabilities with which a subject belongs to each mixture component are known, and these probabilities differ from subject to subject, so the observations are independent but not identically distributed.

We consider objects from a mixture with varying concentrations: every object of the sample Ξ_n belongs to one of the M mixture components. Each object of the sample Ξ_n = (ξ_j)_{j=1}^n has observed characteristics ξ_j = (X_j, Y_j) ∈ ℝ^D and one hidden characteristic κ_j, where κ_j = m if the j-th object belongs to the m-th component. The labels κ_j are unknown, but the mixing probabilities p_{j;n}^m = P{κ_j = m} are known. Here X_j is a vector of regressors and Y_j is the response in the regression model

Y_j = g(X_j; b^{(κ_j)}) + ε_j,

where b^{(m)} ∈ Θ ⊆ ℝ^d is the vector of unknown regression parameters of the m-th component, g: ℝ^{D−1} × Θ → ℝ is a known regression function, and ε_j is a regression error term. The random variables X_j and ε_j are independent, and their distributions differ across the mixture components.
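To make the setting concrete, the sketch below is a minimal, illustrative Python simulation under simplifying assumptions: a two-component mixture (M = 2), a scalar regressor, a weighted nonlinear least-squares fit that weights each observation by its known mixing probability (the paper's MVC estimators use specially constructed weights), and the classical delete-one jackknife covariance estimate rather than the authors' modified JK. The regression function g, the parameter values, and all names in the code are assumptions made for the demonstration only.

```python
# Illustrative sketch (not the authors' exact estimator): weighted nonlinear
# least squares for one mixture component under known mixing probabilities,
# with a classical delete-one jackknife estimate of the covariance matrix.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(0)

def g(x, b):
    """Assumed known regression function g(x; b) = b0 * exp(b1 * x)."""
    return b[0] * np.exp(b[1] * x)

# --- simulate a two-component mixture with varying concentrations ----------
n = 400
p1 = rng.uniform(0.2, 0.8, size=n)       # P{kappa_j = 1}, known, varies with j
p = np.column_stack([p1, 1.0 - p1])      # mixing probabilities p_{j;n}^m
kappa = 1 + (rng.uniform(size=n) > p1)   # hidden component labels (1 or 2)
true_b = {1: np.array([1.0, 0.5]), 2: np.array([2.0, -0.3])}

X = rng.uniform(0.0, 2.0, size=n)
eps = rng.normal(scale=0.2, size=n)
Y = np.array([g(X[j], true_b[kappa[j]]) for j in range(n)]) + eps

# --- weighted nonlinear LS for one component (illustrative weighting) ------
def fit_component(X, Y, w, b0):
    # Minimize sum_j w_j * (Y_j - g(X_j; b))^2 via weighted residuals.
    res = least_squares(lambda b: np.sqrt(w) * (Y - g(X, b)), x0=b0)
    return res.x

def estimate(X, Y, p, m, b0=np.array([1.0, 0.1])):
    # Simple choice: weight each observation by its probability of belonging
    # to component m; the paper uses specially constructed MVC weights.
    return fit_component(X, Y, p[:, m - 1], b0)

b_hat = estimate(X, Y, p, m=1)

# --- classical delete-one jackknife covariance estimate --------------------
leave_one_out = np.array([
    estimate(np.delete(X, j), np.delete(Y, j), np.delete(p, j, axis=0), m=1)
    for j in range(n)
])
b_bar = leave_one_out.mean(axis=0)
cov_jk = (n - 1) / n * (leave_one_out - b_bar).T @ (leave_one_out - b_bar)

print("estimate for component 1:", b_hat)
print("jackknife covariance estimate:\n", cov_jk)
```

The paper studies consistency of the JK covariance estimator for such nonlinear MVC models; the sketch only illustrates the mechanics of the delete-one recomputation.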