SCHEME: Stochastically convergent heuristical Expectation Maximization estimation

M. A. Tope, Joel M. Morris
{"title":"方案:随机收敛启发式期望最大化估计","authors":"M. A. Tope, Joel M. Morris","doi":"10.1109/CISS.2014.6814110","DOIUrl":null,"url":null,"abstract":"This paper introduces a modification to the EM (Expectation Maximization) algorithm potentially allowing reliable convergence to the ML (Maximum Likelihood) parameter estimate for a set of previously intractable problems. The modification is based on the MCEM (Monte Carlo EM) algorithm, which substitutes sample averages for the explicit calculation of expectation. A problem with previous algorithms is that the number of samples required for convergence and the generally convergence behavior was uncertain. Using information geometric principles, we arrive at a new formulation that ensures convergence with probability one. Further, we begin an investigation attempting to minimize the number of samples required to obtain an acceptable approximation of the ML estimate. This algorithm is well suited to solve numerous challenging statistical problems.","PeriodicalId":169460,"journal":{"name":"2014 48th Annual Conference on Information Sciences and Systems (CISS)","volume":"48 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2014-03-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"SCHEME: Stochastically convergent heuristical Expectation Maximization estimation\",\"authors\":\"M. A. Tope, Joel M. Morris\",\"doi\":\"10.1109/CISS.2014.6814110\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"This paper introduces a modification to the EM (Expectation Maximization) algorithm potentially allowing reliable convergence to the ML (Maximum Likelihood) parameter estimate for a set of previously intractable problems. The modification is based on the MCEM (Monte Carlo EM) algorithm, which substitutes sample averages for the explicit calculation of expectation. A problem with previous algorithms is that the number of samples required for convergence and the generally convergence behavior was uncertain. Using information geometric principles, we arrive at a new formulation that ensures convergence with probability one. Further, we begin an investigation attempting to minimize the number of samples required to obtain an acceptable approximation of the ML estimate. 
This algorithm is well suited to solve numerous challenging statistical problems.\",\"PeriodicalId\":169460,\"journal\":{\"name\":\"2014 48th Annual Conference on Information Sciences and Systems (CISS)\",\"volume\":\"48 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2014-03-19\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2014 48th Annual Conference on Information Sciences and Systems (CISS)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/CISS.2014.6814110\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2014 48th Annual Conference on Information Sciences and Systems (CISS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CISS.2014.6814110","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

This paper introduces a modification to the EM (Expectation Maximization) algorithm that potentially allows reliable convergence to the ML (Maximum Likelihood) parameter estimate for a set of previously intractable problems. The modification is based on the MCEM (Monte Carlo EM) algorithm, which substitutes sample averages for the explicit calculation of the expectation. A problem with previous algorithms is that the number of samples required for convergence, and the convergence behavior in general, were uncertain. Using information-geometric principles, we arrive at a new formulation that ensures convergence with probability one. Further, we begin an investigation attempting to minimize the number of samples required to obtain an acceptable approximation of the ML estimate. This algorithm is well suited to solving numerous challenging statistical problems.
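To make the MCEM idea that the abstract builds on concrete, the sketch below replaces the exact E-step expectation of classical EM with a Monte Carlo average over sampled latent variables, for a toy two-component Gaussian mixture. This is a minimal illustration under assumed settings (known mixing weight and unit variance, and an arbitrarily growing Monte Carlo sample size); it is not the paper's SCHEME algorithm or its information-geometric sample-size criterion.

```python
# Minimal MCEM sketch (illustrative only; not the SCHEME algorithm from the paper).
# Model: two-component Gaussian mixture with known mixing weight (0.5) and unit
# variance; the latent component labels are the missing data. The exact E-step
# expectation (posterior responsibilities) is replaced by a Monte Carlo average
# over sampled labels, with a sample size that grows across iterations -- a
# common MCEM heuristic, chosen here arbitrarily.
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data from the mixture 0.5*N(-2,1) + 0.5*N(+2,1)
true_means = np.array([-2.0, 2.0])
labels = rng.integers(0, 2, size=500)
x = rng.normal(true_means[labels], 1.0)

def mcem_two_gaussians(x, n_iter=50, m0=10, growth=1.2):
    """MCEM for the two component means; mixing weight and variance assumed known."""
    mu = np.array([-1.0, 1.0])   # initial guess for the two means
    m = m0                       # Monte Carlo sample size per E-step
    for _ in range(n_iter):
        # Posterior probability that each point belongs to component 1
        log_p0 = -0.5 * (x - mu[0]) ** 2
        log_p1 = -0.5 * (x - mu[1]) ** 2
        r1 = 1.0 / (1.0 + np.exp(log_p0 - log_p1))
        # Monte Carlo "E-step": draw m label samples per point and average,
        # instead of using r1 directly as classical EM would.
        z = rng.random((int(m), x.size)) < r1   # shape (m, n); True = component 1
        w1 = z.mean(axis=0)                     # MC estimate of responsibilities
        w0 = 1.0 - w1
        # M-step: responsibility-weighted sample means
        mu = np.array([np.sum(w0 * x) / np.sum(w0),
                       np.sum(w1 * x) / np.sum(w1)])
        m *= growth                             # grow the MC sample size over iterations
    return mu

print(mcem_two_gaussians(x))   # should approach [-2, 2]
```

As m grows, the Monte Carlo responsibilities approach the exact E-step quantities and the iteration behaves like classical EM; how fast m must grow to guarantee convergence is exactly the kind of question the paper addresses.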