R. Wilcox (1996). British Journal of Mathematical and Statistical Psychology, 39(1). doi:10.1111/J.2044-8317.1996.TB01088.X. Published 1996-11-01. Citations: 12.
A review of some recent developments in robust regression
In situations where the goal is to understand how a random variable y is related to a set of p predictor variables, modern robust regression methods can be invaluable. One reason is that even one unusual value in the design space, or one outlier among the y values, can have a large impact on the ordinary least squares estimate of the parameters of the usual linear model. That is, a single unusual value or outlier can give a highly distorted view of how two or more random variables are related. Another reason is that modern robust methods can be much more efficient than ordinary least squares yet maintain good efficiency under the ideal conditions of normality and a homoscedastic error term. Even when sampling is from light-tailed distributions, there are situations where certain robust methods are highly efficient compared to least squares, as is indicated in this paper. Most applied researchers in psychology simply ignore these problems. In the hope of improving current practice, this paper reviews some of the robust methods currently available with an emphasis on recent developments. Of particular interest are methods for computing confidence intervals and dealing with heteroscedasticity in the error term.
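The influence of a single outlier on ordinary least squares, which the abstract emphasizes, is easy to demonstrate numerically. The sketch below (not from the paper; the data and the choice of the Theil-Sen estimator as the robust comparison are illustrative assumptions) fits a slope to ten points lying exactly on a line of slope 2, then corrupts one y value and compares the OLS slope with the Theil-Sen slope, i.e. the median of all pairwise slopes:

```python
# Illustrative sketch: one gross outlier among the y values pulls the OLS
# slope far from the true value, while a robust estimator (Theil-Sen,
# the median of pairwise slopes) is essentially unaffected.
from statistics import median

def ols_slope(x, y):
    """Ordinary least squares slope for a simple linear regression."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

def theil_sen_slope(x, y):
    """Theil-Sen estimator: median of slopes over all pairs of points."""
    slopes = [(y[j] - y[i]) / (x[j] - x[i])
              for i in range(len(x))
              for j in range(i + 1, len(x))
              if x[j] != x[i]]
    return median(slopes)

x = list(range(10))
y = [2.0 * xi for xi in x]   # true slope is 2, no noise
y[-1] = 100.0                # a single gross outlier in y

print(ols_slope(x, y))       # pulled well above the true slope 2
print(theil_sen_slope(x, y)) # still recovers the true slope
```

Because a majority of the pairwise slopes are untouched by the outlier, the median stays at 2.0, while the OLS slope, whose influence function is unbounded in y, is dragged upward by the one corrupted point.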