{"title":"马尔可夫链蒙特卡罗算法","authors":"J. Rosenthal","doi":"10.1109/WITS.1994.513879","DOIUrl":null,"url":null,"abstract":"We briefly describe Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings (1953, 1970) algorithm, which are frequently used in the statistics literature to explore complicated probability distributions. We present a general method for proving rigorous, a priori bounds an the number of iterations required to achieve convergence of the algorithms.","PeriodicalId":423518,"journal":{"name":"Proceedings of 1994 Workshop on Information Theory and Statistics","volume":"88 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1994-10-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Markov chain Monte Carlo algorithms\",\"authors\":\"J. Rosenthal\",\"doi\":\"10.1109/WITS.1994.513879\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We briefly describe Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings (1953, 1970) algorithm, which are frequently used in the statistics literature to explore complicated probability distributions. We present a general method for proving rigorous, a priori bounds an the number of iterations required to achieve convergence of the algorithms.\",\"PeriodicalId\":423518,\"journal\":{\"name\":\"Proceedings of 1994 Workshop on Information Theory and Statistics\",\"volume\":\"88 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1994-10-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of 1994 Workshop on Information Theory and Statistics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/WITS.1994.513879\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of 1994 Workshop on Information Theory and Statistics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/WITS.1994.513879","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
We briefly describe Markov chain Monte Carlo algorithms, such as the Gibbs sampler and the Metropolis-Hastings (1953, 1970) algorithm, which are frequently used in the statistics literature to explore complicated probability distributions. We present a general method for proving rigorous, a priori bounds on the number of iterations required to achieve convergence of the algorithms.
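To make the algorithm mentioned above concrete, here is a minimal sketch of a random-walk Metropolis sampler (the symmetric-proposal special case of Metropolis-Hastings). The target density, proposal scale, iteration count, and burn-in length are illustrative assumptions for this sketch, not values taken from the paper.

```python
import numpy as np

def metropolis_hastings(log_target, x0, n_iter=10_000, proposal_scale=1.0, rng=None):
    """Random-walk Metropolis sampler with a symmetric Gaussian proposal.

    log_target: log of the (possibly unnormalized) target density.
    x0: starting state of the chain.
    """
    rng = np.random.default_rng() if rng is None else rng
    x = x0
    log_p = log_target(x)
    samples = np.empty(n_iter)
    for i in range(n_iter):
        # Propose a move from a Gaussian centered at the current state.
        proposal = x + proposal_scale * rng.standard_normal()
        log_p_prop = log_target(proposal)
        # Accept with probability min(1, target(proposal) / target(current));
        # the symmetric proposal density cancels in the acceptance ratio.
        if np.log(rng.uniform()) < log_p_prop - log_p:
            x, log_p = proposal, log_p_prop
        samples[i] = x
    return samples

if __name__ == "__main__":
    # Illustrative target: an unnormalized standard normal density.
    log_normal = lambda x: -0.5 * x ** 2
    chain = metropolis_hastings(log_normal, x0=5.0, n_iter=20_000)
    # Discard an initial burn-in segment before summarizing the chain.
    print(chain[2_000:].mean(), chain[2_000:].std())
```

The burn-in of 2,000 iterations here is an ad hoc choice; the point of the a priori convergence bounds described in the abstract is precisely to replace such guesses with rigorous estimates of how many iterations are needed.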