Román Salmerón‐Gómez, Catalina B. García‐García, José García‐Pérez
{"title":"提升回归:理由、特性和应用","authors":"Román Salmerón‐Gómez, Catalina B. García‐García, José García‐Pérez","doi":"10.1111/insr.12575","DOIUrl":null,"url":null,"abstract":"SummaryMulticollinearity results in inflation in the variance of the ordinary least squares estimators due to the correlation between two or more independent variables (including the constant term). A widely applied solution is to estimate with penalised estimators such as the ridge estimator, which trade off some bias in the estimators to gain a reduction in the variance of these estimators. Although the variance diminishes with these procedures, all seem to indicate that the inference and goodness of fit are controversial. Alternatively, the raise regression allows mitigation of the problems associated with multicollinearity without the loss of inference or the coefficient of determination. This paper completely formalises the raise estimator. For the first time, the norm of the estimator, the behaviour of the individual and joint significance, the behaviour of the mean squared error and the coefficient of variation are analysed. We also present the generalisation of the estimation and the relation between the raise and the residualisation estimators. To have a better understanding of raise regression, previous contributions are also summarised: its mean squared error, the variance inflation factor, the condition number, adequate selection of the variable to be raised, the successive raising, and the relation between the raise and the ridge estimator. The usefulness of the raise regression as an alternative to mitigate multicollinearity is illustrated with two empirical applications.","PeriodicalId":14479,"journal":{"name":"International Statistical Review","volume":null,"pages":null},"PeriodicalIF":1.7000,"publicationDate":"2024-05-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Raise Regression: Justification, Properties and Application\",\"authors\":\"Román Salmerón‐Gómez, Catalina B. García‐García, José García‐Pérez\",\"doi\":\"10.1111/insr.12575\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SummaryMulticollinearity results in inflation in the variance of the ordinary least squares estimators due to the correlation between two or more independent variables (including the constant term). A widely applied solution is to estimate with penalised estimators such as the ridge estimator, which trade off some bias in the estimators to gain a reduction in the variance of these estimators. Although the variance diminishes with these procedures, all seem to indicate that the inference and goodness of fit are controversial. Alternatively, the raise regression allows mitigation of the problems associated with multicollinearity without the loss of inference or the coefficient of determination. This paper completely formalises the raise estimator. For the first time, the norm of the estimator, the behaviour of the individual and joint significance, the behaviour of the mean squared error and the coefficient of variation are analysed. We also present the generalisation of the estimation and the relation between the raise and the residualisation estimators. 
To have a better understanding of raise regression, previous contributions are also summarised: its mean squared error, the variance inflation factor, the condition number, adequate selection of the variable to be raised, the successive raising, and the relation between the raise and the ridge estimator. The usefulness of the raise regression as an alternative to mitigate multicollinearity is illustrated with two empirical applications.\",\"PeriodicalId\":14479,\"journal\":{\"name\":\"International Statistical Review\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":1.7000,\"publicationDate\":\"2024-05-02\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Statistical Review\",\"FirstCategoryId\":\"100\",\"ListUrlMain\":\"https://doi.org/10.1111/insr.12575\",\"RegionNum\":3,\"RegionCategory\":\"数学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"STATISTICS & PROBABILITY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Statistical Review","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1111/insr.12575","RegionNum":3,"RegionCategory":"数学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"STATISTICS & PROBABILITY","Score":null,"Total":0}
The Raise Regression: Justification, Properties and Application
Summary
Multicollinearity inflates the variance of the ordinary least squares estimators because of the correlation between two or more independent variables (including the constant term). A widely applied solution is to use penalised estimators such as the ridge estimator, which accepts some bias in the estimates in exchange for a reduction in their variance. Although these procedures reduce the variance, inference and goodness of fit become problematic. Alternatively, raise regression mitigates the problems associated with multicollinearity without sacrificing inference or the coefficient of determination. This paper fully formalises the raise estimator: for the first time, the norm of the estimator, the behaviour of individual and joint significance, the mean squared error and the coefficient of variation are analysed. We also present a generalisation of the estimator and the relation between the raise and residualisation estimators. To give a fuller picture of raise regression, previous contributions are also summarised: its mean squared error, the variance inflation factor, the condition number, the appropriate selection of the variable to be raised, successive raising, and the relation between the raise and ridge estimators. The usefulness of raise regression as an alternative for mitigating multicollinearity is illustrated with two empirical applications.
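To make the mechanics concrete, below is a minimal Python sketch of the raise transformation on which the paper builds: the variable to be raised, x_j, is replaced by x_j + λ·ê_j, where ê_j are the residuals from regressing x_j on the remaining regressors (including the constant term). The simulated data, the choice λ = 5 and the helper names (raise_variable, ols) are illustrative assumptions, not material from the paper.

```python
import numpy as np

def raise_variable(X, j, lam):
    """Raise column j of the design matrix X by the raising factor lam.

    The raised variable is x_j + lam * e_j, where e_j are the residuals
    from regressing x_j on the remaining columns of X (including the
    intercept column).  lam = 0 recovers the original OLS setup.
    """
    X = np.asarray(X, dtype=float)
    others = np.delete(X, j, axis=1)
    # Residuals of x_j on the remaining regressors (auxiliary OLS fit).
    coef, *_ = np.linalg.lstsq(others, X[:, j], rcond=None)
    e_j = X[:, j] - others @ coef
    X_raised = X.copy()
    X_raised[:, j] = X[:, j] + lam * e_j
    return X_raised

def ols(X, y):
    """Ordinary least squares coefficients via least-squares solve."""
    coef, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coef

# Illustrative data with two highly correlated regressors (assumed example).
rng = np.random.default_rng(0)
n = 200
x1 = rng.normal(size=n)
x2 = x1 + 0.05 * rng.normal(size=n)           # nearly collinear with x1
X = np.column_stack([np.ones(n), x1, x2])     # constant term included
y = 1.0 + 2.0 * x1 + 3.0 * x2 + rng.normal(size=n)

X_raised = raise_variable(X, j=2, lam=5.0)    # raise x2 with lambda = 5

for name, design in [("OLS", X), ("raised (lambda=5)", X_raised)]:
    beta = ols(design, y)
    resid = y - design @ beta
    r2 = 1 - resid @ resid / ((y - y.mean()) @ (y - y.mean()))
    # Condition number of the design as a rough collinearity diagnostic.
    print(f"{name:>18}: R^2 = {r2:.4f}, cond(X) = {np.linalg.cond(design):.1f}")
```

Because the raised column is a linear combination of the original regressors, the fitted values and the coefficient of determination of the OLS regression are unchanged, while the condition number of the design matrix falls; this is the behaviour the abstract contrasts with ridge-type penalisation.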
Journal Introduction:
International Statistical Review is the flagship journal of the International Statistical Institute (ISI) and of its family of Associations. It publishes papers of broad and general interest in statistics and probability. The term Review is to be interpreted broadly. The types of papers that are suitable for publication include (but are not limited to) the following: reviews/surveys of significant developments in theory, methodology, statistical computing and graphics, statistical education, and application areas; tutorials on important topics; expository papers on emerging areas of research or application; papers describing new developments and/or challenges in relevant areas; papers addressing foundational issues; papers on the history of statistics and probability; white papers on topics of importance to the profession or society; and historical assessment of seminal papers in the field and their impact.