{"title":"Model Training","authors":"Raymond A. Anderson","doi":"10.1093/oso/9780192844194.003.0024","DOIUrl":null,"url":null,"abstract":"The chapter provides an approach and issues for model-training using Logistic Regression. (1) Regression—key model qualities plus i) options and settings, and ii) outputs to be expected/demanded. (2) Variable selection—i) criteria; ii) automation; iii) stepwise review; iv) constraining betas, where coefficients do not make sense; v) stepping by Gini, model pruning. (3) Correlation checks—i) multicollinearity—checks of variance inflation factors; ii) correlations—further checks to guard against the inclusion of highly correlated variables. (4) Blockwise variable selection—treatment in groups: i) variable reduction; ii) staged, or hierarchical regression; iii) embedded, model outputs as predictors; iv) ensemble, using outputs of other models. (5) Multi-model comparisons—Lorenz curves and strategy curves, should choices not be clear. (6) Calibration—i) simple adjustment by a constant; ii) piecewise, varying adjustments by the prediction; iii) score and points—adjusting the final score or constituent points; iv) MAPA, for more complex situations","PeriodicalId":286194,"journal":{"name":"Credit Intelligence & Modelling","volume":"111 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Credit Intelligence & Modelling","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1093/oso/9780192844194.003.0024","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
The chapter presents an approach to, and the issues arising in, model training using logistic regression.
(1) Regression: key model qualities, plus i) options and settings, and ii) the outputs to be expected or demanded.
(2) Variable selection: i) criteria; ii) automation; iii) stepwise review; iv) constraining betas, where coefficients do not make sense; v) stepping by Gini and model pruning.
(3) Correlation checks: i) multicollinearity, checked via variance inflation factors; ii) correlations, further checks to guard against including highly correlated variables.
(4) Blockwise variable selection, i.e. treating variables in groups: i) variable reduction; ii) staged, or hierarchical, regression; iii) embedded, using model outputs as predictors; iv) ensemble, using the outputs of other models.
(5) Multi-model comparisons: Lorenz curves and strategy curves, should the choice between candidates not be clear.
(6) Calibration: i) simple adjustment by a constant; ii) piecewise, varying the adjustment by the prediction; iii) score and points, adjusting the final score or its constituent points; iv) MAPA, for more complex situations.
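The chapter itself does not prescribe tooling, but a minimal sketch of a few of the steps listed above may help orient readers: fitting the logistic regression, checking variance inflation factors, measuring the Gini coefficient (2 × AUC − 1), and calibrating by a constant shift in log-odds. The sketch assumes Python with NumPy and scikit-learn on synthetic data; the helpers `vif` and `calibrate_constant` are illustrative names, not the author's API.

```python
# Illustrative sketch only: synthetic data, not the chapter's worked example.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)

# Synthetic data: three candidate characteristics, one deliberately near-collinear.
n = 5_000
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
x3 = 0.9 * x1 + 0.1 * rng.normal(size=n)          # near-duplicate of x1
X = np.column_stack([x1, x2, x3])
p_true = 1.0 / (1.0 + np.exp(-(-2.0 + 1.2 * x1 - 0.8 * x2)))
y = rng.binomial(1, p_true)

# (1) Regression: fit the model and inspect the coefficients (betas).
model = LogisticRegression().fit(X, y)
print("betas:", model.coef_[0], "intercept:", model.intercept_[0])

# (3) Correlation checks: variance inflation factors, VIF_j = 1 / (1 - R^2_j).
def vif(X):
    """Regress each column on the others and convert R-squared to a VIF."""
    out = []
    for j in range(X.shape[1]):
        others = np.delete(X, j, axis=1)
        A = np.column_stack([np.ones(len(X)), others])
        beta, *_ = np.linalg.lstsq(A, X[:, j], rcond=None)
        resid = X[:, j] - A @ beta
        r2 = 1.0 - resid.var() / X[:, j].var()
        out.append(1.0 / (1.0 - r2))
    return np.array(out)

print("VIFs:", vif(X).round(2))                    # x1 and x3 should flag high inflation

# (2)(v) Stepping by Gini: Gini = 2 * AUC - 1 on the model's predicted probabilities.
scores = model.predict_proba(X)[:, 1]
print("Gini:", round(2.0 * roc_auc_score(y, scores) - 1.0, 3))

# (6)(i) Calibration by a constant: shift the log-odds so the mean PD hits a target.
def calibrate_constant(p, target_rate, lo=-5.0, hi=5.0, iters=60):
    """Bisect for the constant c with mean(sigmoid(logit(p) + c)) == target_rate."""
    logit = np.log(p / (1.0 - p))
    for _ in range(iters):
        c = 0.5 * (lo + hi)
        if (1.0 / (1.0 + np.exp(-(logit + c)))).mean() > target_rate:
            hi = c
        else:
            lo = c
    return 0.5 * (lo + hi)

c = calibrate_constant(scores, target_rate=0.10)
calibrated = 1.0 / (1.0 + np.exp(-(np.log(scores / (1.0 - scores)) + c)))
print("constant:", round(c, 3), "calibrated mean PD:", round(calibrated.mean(), 4))
```

In practice these checks would run on the development sample, with high-VIF or highly correlated characteristics dropped or constrained before the stepwise review, and the calibration constant chosen against the portfolio's expected default rate.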