{"title":"从最大熵原理推导出的贝叶斯推理和频数推理,在传播统计方法的不确定性方面的应用","authors":"David R. Bickel","doi":"10.1007/s00362-024-01597-3","DOIUrl":null,"url":null,"abstract":"<p>Using statistical methods to analyze data requires considering the data set to be randomly generated from a probability distribution that is unknown but idealized according to a mathematical model consisting of constraints, assumptions about the distribution. Since the choice of such a model is up to the scientist, there is an understandable bias toward choosing models that make scientific conclusions appear more certain than they really are. There is a similar bias in the scientist’s choice of whether to use Bayesian or frequentist methods. This article provides tools to mitigate both of those biases on the basis of a principle of information theory. It is found that the same principle unifies Bayesianism with the fiducial version of frequentism. The principle arguably overcomes not only the main objections against fiducial inference but also the main Bayesian objection against the use of confidence intervals.</p>","PeriodicalId":51166,"journal":{"name":"Statistical Papers","volume":"46 1","pages":""},"PeriodicalIF":1.2000,"publicationDate":"2024-08-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bayesian and frequentist inference derived from the maximum entropy principle with applications to propagating uncertainty about statistical methods\",\"authors\":\"David R. Bickel\",\"doi\":\"10.1007/s00362-024-01597-3\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Using statistical methods to analyze data requires considering the data set to be randomly generated from a probability distribution that is unknown but idealized according to a mathematical model consisting of constraints, assumptions about the distribution. Since the choice of such a model is up to the scientist, there is an understandable bias toward choosing models that make scientific conclusions appear more certain than they really are. There is a similar bias in the scientist’s choice of whether to use Bayesian or frequentist methods. This article provides tools to mitigate both of those biases on the basis of a principle of information theory. It is found that the same principle unifies Bayesianism with the fiducial version of frequentism. 
The principle arguably overcomes not only the main objections against fiducial inference but also the main Bayesian objection against the use of confidence intervals.</p>\",\"PeriodicalId\":51166,\"journal\":{\"name\":\"Statistical Papers\",\"volume\":\"46 1\",\"pages\":\"\"},\"PeriodicalIF\":1.2000,\"publicationDate\":\"2024-08-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Statistical Papers\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1007/s00362-024-01597-3\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistical Papers","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1007/s00362-024-01597-3","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
Bayesian and frequentist inference derived from the maximum entropy principle with applications to propagating uncertainty about statistical methods
Using statistical methods to analyze data requires considering the data set to be randomly generated from a probability distribution that is unknown but idealized according to a mathematical model consisting of constraints, that is, assumptions about the distribution. Since the choice of such a model is up to the scientist, there is an understandable bias toward choosing models that make scientific conclusions appear more certain than they really are. There is a similar bias in the scientist's choice of whether to use Bayesian or frequentist methods. This article provides tools to mitigate both of those biases on the basis of a principle of information theory. It is found that the same principle unifies Bayesianism with the fiducial version of frequentism. The principle arguably overcomes not only the main objections against fiducial inference but also the main Bayesian objection against the use of confidence intervals.
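For readers unfamiliar with the information-theoretic principle the abstract invokes, the maximum entropy principle selects, among all distributions satisfying the model's constraints, the one with the largest Shannon entropy; the solution takes an exponential form in the constraint functions. The following minimal sketch is not from the paper: the support, the mean constraint, and the numbers are illustrative only, and it merely shows the principle at work on a small discrete example.

import numpy as np
from scipy.optimize import brentq

# Illustrative setup: distributions on {1, ..., 6} constrained to have mean 4.5.
support = np.arange(1, 7)
target_mean = 4.5

def mean_given_lambda(lam):
    """Mean of the exponential-family candidate p_k proportional to exp(lam * k)."""
    w = np.exp(lam * support)
    p = w / w.sum()
    return p @ support

# The maximum entropy solution under a mean constraint is p_k proportional to
# exp(lam * k); solve for the Lagrange multiplier lam that matches the target mean.
lam = brentq(lambda l: mean_given_lambda(l) - target_mean, -10.0, 10.0)
w = np.exp(lam * support)
p = w / w.sum()

print("maximum entropy probabilities:", np.round(p, 4))
print("achieved mean:", p @ support)
print("entropy:", -(p @ np.log(p)))

Any other distribution on the same support with mean 4.5 has strictly lower entropy, which is the sense in which the maximum entropy choice adds no assumptions beyond the stated constraints.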
Journal introduction:
The journal Statistical Papers addresses itself to all persons and organizations that have to deal with statistical methods in their own field of work. It attempts to provide a forum for the presentation and critical assessment of statistical methods, in particular for the discussion of their methodological foundations as well as their potential applications. Methods that have broad applications will be preferred. However, special attention is given to those statistical methods which are relevant to the economic and social sciences. In addition to original research papers, readers will find survey articles, short notes, reports on statistical software, a problem section, and book reviews.