{"title":"Parametric estimation and inference","authors":"M. Edge","doi":"10.1093/oso/9780198827627.003.0011","DOIUrl":null,"url":null,"abstract":"If it is reasonable to assume that the data are generated by a fully parametric model, then maximum-likelihood approaches to estimation and inference have many appealing properties. Maximum-likelihood estimators are obtained by identifying parameters that maximize the likelihood function, which can be done using calculus or using numerical approaches. Such estimators are consistent, and if the costs of errors in estimation are described by a squared-error loss function, then they are also efficient compared with their consistent competitors. The sampling variance of a maximum-likelihood estimate can be estimated in various ways. As always, one possibility is the bootstrap. In many models, the variance of the maximum-likelihood estimator can be derived directly once its form is known. A third approach is to rely on general properties of maximum-likelihood estimators and use the Fisher information. Similarly, there are many ways to test hypotheses about parameters estimated by maximum likelihood. This chapter discusses the Wald test, which relies on the fact that the sampling distribution of maximum-likelihood estimators is normal in large samples, and the likelihood-ratio test, which is a general approach for testing hypotheses relating nested pairs of models.","PeriodicalId":192186,"journal":{"name":"Statistical Thinking from Scratch","volume":"64 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-06-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Statistical Thinking from Scratch","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/oso/9780198827627.003.0011","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
If it is reasonable to assume that the data are generated by a fully parametric model, then maximum-likelihood approaches to estimation and inference have many appealing properties. Maximum-likelihood estimators are obtained by identifying the parameter values that maximize the likelihood function, which can be done with calculus or with numerical methods. Such estimators are consistent, and if the costs of errors in estimation are described by a squared-error loss function, they are also efficient relative to their consistent competitors. The sampling variance of a maximum-likelihood estimate can be estimated in several ways. As always, one possibility is the bootstrap. In many models, the variance of the maximum-likelihood estimator can be derived directly once its form is known. A third approach relies on general properties of maximum-likelihood estimators and uses the Fisher information. Similarly, there are many ways to test hypotheses about parameters estimated by maximum likelihood. This chapter discusses the Wald test, which relies on the fact that the sampling distribution of maximum-likelihood estimators is normal in large samples, and the likelihood-ratio test, a general approach for testing hypotheses that compare nested pairs of models.
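The pieces named in the abstract can be made concrete in a small sketch. The example below is not from the chapter: it assumes exponentially distributed data with an unknown rate (the names x, rate_hat, and the hypothesized value rate_0 are illustrative), and it shows the maximum-likelihood estimate obtained both in closed form and numerically, standard errors from the Fisher information and from the bootstrap, and Wald and likelihood-ratio tests of the same nested hypothesis.

```python
# Illustrative sketch (not from the chapter): ML estimation and inference
# for the rate of an exponential model. All names here are hypothetical.
import numpy as np
from scipy import optimize, stats

rng = np.random.default_rng(0)
n, true_rate = 200, 2.0
x = rng.exponential(scale=1 / true_rate, size=n)  # simulated data

def neg_log_lik(rate):
    # Exponential log-likelihood: n*log(rate) - rate*sum(x), negated
    # so that minimizing it maximizes the likelihood.
    return -(n * np.log(rate) - rate * x.sum())

# Calculus gives the maximizer in closed form: rate_hat = 1 / mean(x).
rate_hat = 1 / x.mean()

# A numerical approach recovers the same answer without the derivative.
res = optimize.minimize_scalar(neg_log_lik, bounds=(1e-6, 50), method="bounded")
assert np.isclose(res.x, rate_hat, atol=1e-4)

# Fisher information for this model is I(rate) = n / rate**2, so the
# large-sample standard error of the MLE is rate_hat / sqrt(n).
se_fisher = rate_hat / np.sqrt(n)

# The bootstrap estimates the same sampling variance by resampling.
boot = np.array([1 / rng.choice(x, size=n, replace=True).mean()
                 for _ in range(2000)])
se_boot = boot.std(ddof=1)

# Wald test of H0: rate = rate_0 (a hypothesized value, here deliberately
# different from the truth), using large-sample normality of the MLE.
rate_0 = 1.5
z = (rate_hat - rate_0) / se_fisher
p_wald = 2 * stats.norm.sf(abs(z))

# Likelihood-ratio test of the same nested hypothesis:
# 2 * (max log-lik - log-lik under H0) ~ chi-squared with 1 df in large samples.
lr = 2 * (neg_log_lik(rate_0) - neg_log_lik(rate_hat))
p_lr = stats.chi2.sf(lr, df=1)

print(f"MLE {rate_hat:.3f}, SE(Fisher) {se_fisher:.3f}, SE(bootstrap) {se_boot:.3f}")
print(f"Wald p = {p_wald:.4f}, likelihood-ratio p = {p_lr:.4f}")
```

Because the exponential model is one of the cases where calculus yields a closed form, the numerical optimizer serves only as a check here; in models without a closed-form maximizer, the numerical route is the practical one.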