{"title":"A new shrinkage method for higher dimensions regression model to remedy of multicollinearity problem","authors":"Z. Ghareeb, Suhad Ali Shaheed Al-Temimi","doi":"10.21533/pen.v11i3.3550","DOIUrl":null,"url":null,"abstract":"This research seeks to present new method of shrinking variables to select some basic variables from large data sets. This new shrinkage estimator is a modification of (Ridge and Adaptive Lasso) shrinkage regression method in the presence of the mixing parameter that was calculated in the Elastic-Net. The Proposed estimator is called (Improved Mixed Shrinkage Estimator (IMSHE)) to handle the problem of multicollinearity. In practice, it is difficult to achieve the required accuracy and efficiency when dealing with a big data set, especially in the case of multicollinearity problem between the explanatory variables. By using Basic shrinkage methods (Lasso, Adaptive Lasso and Elastic Net) and comparing their results with the New shrinkage method (IMSH) was applied to a set of obesity -related data containing (52) variables for a sample of (112) observations. All shrinkage methods have also been compared for efficiency through Mean Square Error (MSE) criterion and Cross Validation Parameter (CVP). The results showed that the best shrinking parameter among the four methods (Lasso, Adaptive Lasso, Elastic Net and IMSH) was for the IMSH shrinkage method, as it corresponds to the lowest (MSE) based on the cross-validation parameter test (CVP). The new proposed method IMSH achieved the optimal shrinking parameter (λ = 0.6932827) according to the (CVP) test, that leads to have minimum value of mean square error (MSE) equal (0.2576002). The results showed when the value of the regularization parameter increases, the value of the shrinkage parameter decreases to become equal to zero, so the ideal number of variables after shrinkage is (p=6)","PeriodicalId":37519,"journal":{"name":"Periodicals of Engineering and Natural Sciences","volume":" ","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Periodicals of Engineering and Natural Sciences","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.21533/pen.v11i3.3550","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"Engineering","Score":null,"Total":0}
Abstract
This research presents a new variable-shrinkage method for selecting a small set of basic variables from large data sets. The new shrinkage estimator modifies the Ridge and Adaptive Lasso shrinkage regression methods by incorporating the mixing parameter computed in the Elastic Net, and is called the Improved Mixed Shrinkage Estimator (IMSHE); it is designed to handle the problem of multicollinearity. In practice, it is difficult to achieve the required accuracy and efficiency with large data sets, especially when multicollinearity exists among the explanatory variables. The basic shrinkage methods (Lasso, Adaptive Lasso, and Elastic Net) and the proposed IMSHE method were applied to an obesity-related data set containing 52 variables for a sample of 112 observations, and their results were compared. The efficiency of all shrinkage methods was assessed using the mean square error (MSE) criterion and the cross-validation parameter (CVP). The results showed that, among the four methods (Lasso, Adaptive Lasso, Elastic Net, and IMSHE), the best shrinkage parameter was obtained by IMSHE, as it corresponds to the lowest MSE under the cross-validation parameter (CVP) test. The proposed IMSHE method achieved the optimal shrinkage parameter (λ = 0.6932827) according to the CVP test, which yields the minimum mean square error (MSE = 0.2576002). The results also showed that as the regularization parameter increases, the shrinkage parameter decreases toward zero, so the ideal number of variables after shrinkage is p = 6.
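As a minimal sketch of the baseline workflow described in the abstract (not the authors' IMSHE estimator), the snippet below tunes Lasso and Elastic Net by cross-validation and compares them by test MSE and by the number of variables surviving shrinkage. The simulated data, split ratio, and fold count are illustrative assumptions; only the dimensions (112 observations, 52 predictors, 6 informative variables) echo the abstract.

```python
# Sketch: cross-validated choice of the shrinkage parameter (lambda, called
# alpha in scikit-learn) for Lasso and Elastic Net, then comparison by MSE.
# This is NOT the paper's IMSHE method; it only illustrates the baselines.
import numpy as np
from sklearn.datasets import make_regression
from sklearn.linear_model import LassoCV, ElasticNetCV
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error

# Simulated data with the dimensions reported in the abstract (assumed setup).
X, y = make_regression(n_samples=112, n_features=52, n_informative=6,
                       noise=5.0, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=0)

# Lasso: lambda chosen by 10-fold cross-validation.
lasso = LassoCV(cv=10, random_state=0).fit(X_train, y_train)

# Elastic Net: cross-validation over lambda and the mixing parameter
# (l1_ratio), i.e. the mixing parameter the proposed IMSHE borrows.
enet = ElasticNetCV(l1_ratio=[0.1, 0.5, 0.7, 0.9, 0.95, 1.0],
                    cv=10, random_state=0).fit(X_train, y_train)

for name, model in [("Lasso", lasso), ("Elastic Net", enet)]:
    mse = mean_squared_error(y_test, model.predict(X_test))
    n_selected = int(np.sum(model.coef_ != 0))  # variables kept after shrinkage
    print(f"{name}: lambda = {model.alpha_:.4f}, "
          f"test MSE = {mse:.4f}, selected variables = {n_selected}")
```

Increasing the regularization strength drives more coefficients exactly to zero, which is the mechanism behind the abstract's observation that the ideal model retains only p = 6 variables after shrinkage.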