{"title":"具有AR(2)误差的一般线性回归模型的Kibria-Lukman估计:与Monte Carlo模拟的比较研究","authors":"T. SÖKÜT AÇAR","doi":"10.53570/jnt.1139885","DOIUrl":null,"url":null,"abstract":"The sensitivity of the least-squares estimation in a regression model is impacted by multicollinearity and autocorrelation problems. To deal with the multicollinearity, Ridge, Liu, and Ridge-type biased estimators have been presented in the statistical literature. The recently proposed Kibria-Lukman estimator is one of the Ridge-type estimators. The literature has compared the Kibria-Lukman estimator with the others using the mean square error criterion for the linear regression model. It was achieved in a study conducted on the Kibria-Lukman estimator's performance under the first-order autoregressive erroneous autocorrelation. When there is an autocorrelation problem with the second-order, evaluating the performance of the Kibria-Lukman estimator according to the mean square error criterion makes this paper original. The scalar mean square error of the Kibria-Lukman estimator under the second-order autoregressive error structure was evaluated using a Monte Carlo simulation and two real examples, and compared with the Generalized Least-squares, Ridge, and Liu estimators. \nThe findings revealed that when the variance of the model was small, the mean square error of the Kibria-Lukman estimator gave very close values with the popular biased estimators. As the model variance grew, Kibria-Lukman did not give fairly similar values with popular biased estimators as in the model with small variance. However, according to the mean square error criterion the Kibria-Lukman estimator outperformed the Generalized Least-Squares estimator in all possible cases.","PeriodicalId":347850,"journal":{"name":"Journal of New Theory","volume":"2 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-12-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Kibria-Lukman Estimator for General Linear Regression Model with AR(2) Errors: A Comparative Study with Monte Carlo Simulation\",\"authors\":\"T. SÖKÜT AÇAR\",\"doi\":\"10.53570/jnt.1139885\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"The sensitivity of the least-squares estimation in a regression model is impacted by multicollinearity and autocorrelation problems. To deal with the multicollinearity, Ridge, Liu, and Ridge-type biased estimators have been presented in the statistical literature. The recently proposed Kibria-Lukman estimator is one of the Ridge-type estimators. The literature has compared the Kibria-Lukman estimator with the others using the mean square error criterion for the linear regression model. It was achieved in a study conducted on the Kibria-Lukman estimator's performance under the first-order autoregressive erroneous autocorrelation. When there is an autocorrelation problem with the second-order, evaluating the performance of the Kibria-Lukman estimator according to the mean square error criterion makes this paper original. The scalar mean square error of the Kibria-Lukman estimator under the second-order autoregressive error structure was evaluated using a Monte Carlo simulation and two real examples, and compared with the Generalized Least-squares, Ridge, and Liu estimators. \\nThe findings revealed that when the variance of the model was small, the mean square error of the Kibria-Lukman estimator gave very close values with the popular biased estimators. 
As the model variance grew, Kibria-Lukman did not give fairly similar values with popular biased estimators as in the model with small variance. However, according to the mean square error criterion the Kibria-Lukman estimator outperformed the Generalized Least-Squares estimator in all possible cases.\",\"PeriodicalId\":347850,\"journal\":{\"name\":\"Journal of New Theory\",\"volume\":\"2 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-12-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of New Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.53570/jnt.1139885\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of New Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.53570/jnt.1139885","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Kibria-Lukman Estimator for General Linear Regression Model with AR(2) Errors: A Comparative Study with Monte Carlo Simulation
The least-squares estimator in a regression model is sensitive to multicollinearity and autocorrelation problems. To deal with multicollinearity, the Ridge, Liu, and other Ridge-type biased estimators have been proposed in the statistical literature. The recently proposed Kibria-Lukman estimator is one of these Ridge-type estimators. Previous studies have compared the Kibria-Lukman estimator with the others under the mean square error criterion for the linear regression model, and its performance has also been examined under first-order autoregressive, AR(1), errors. Evaluating the performance of the Kibria-Lukman estimator according to the mean square error criterion when the errors follow a second-order autoregressive, AR(2), process is what makes this paper original. The scalar mean square error of the Kibria-Lukman estimator under the AR(2) error structure was evaluated using a Monte Carlo simulation and two real-data examples and compared with the Generalized Least-Squares, Ridge, and Liu estimators.
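For reference, a sketch of the estimators being compared, written for the model whitened with respect to the AR(2) error covariance matrix $\Omega$; the abstract does not give the exact generalized forms or the choices of the biasing parameters $k$ and $d$ used in the paper, so these standard forms are an assumption:

```latex
% Whitened model: y^* = P y, X^* = P X, with P chosen so that P \Omega P' = I.
\begin{align*}
\hat{\beta}_{\mathrm{GLS}}      &= (X^{*\prime}X^{*})^{-1}X^{*\prime}y^{*} \\
\hat{\beta}_{\mathrm{Ridge}}(k) &= (X^{*\prime}X^{*} + kI)^{-1}X^{*\prime}y^{*}, \quad k > 0 \\
\hat{\beta}_{\mathrm{Liu}}(d)   &= (X^{*\prime}X^{*} + I)^{-1}(X^{*\prime}X^{*} + dI)\,\hat{\beta}_{\mathrm{GLS}}, \quad 0 < d < 1 \\
\hat{\beta}_{\mathrm{KL}}(k)    &= (X^{*\prime}X^{*} + kI)^{-1}(X^{*\prime}X^{*} - kI)\,\hat{\beta}_{\mathrm{GLS}}, \quad k > 0
\end{align*}
```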
The findings revealed that when the model variance was small, the mean square error of the Kibria-Lukman estimator was very close to that of the popular biased estimators. As the model variance grew, the Kibria-Lukman estimator no longer tracked the popular biased estimators as closely as it did in the small-variance setting. However, according to the mean square error criterion, the Kibria-Lukman estimator outperformed the Generalized Least-Squares estimator in all considered cases.
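A minimal Monte Carlo sketch of the kind of comparison described above, assuming a simple multicollinear design, fixed illustrative values for the AR(2) coefficients and the biasing parameters k and d, and the empirical mean square error taken as the average squared distance of each estimate from the true coefficient vector; none of these settings are taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)

def ar2_corr_matrix(n, phi1, phi2):
    # Autocorrelations of a stationary AR(2) process via the Yule-Walker recursion
    rho = np.empty(n)
    rho[0] = 1.0
    rho[1] = phi1 / (1.0 - phi2)
    for k in range(2, n):
        rho[k] = phi1 * rho[k - 1] + phi2 * rho[k - 2]
    return rho[np.abs(np.subtract.outer(np.arange(n), np.arange(n)))]

def simulate_once(n=50, p=4, phi1=0.5, phi2=0.3, sigma=1.0, k=0.5, d=0.5):
    # Multicollinear design: columns share a common factor plus small noise (illustrative)
    common = rng.standard_normal((n, 1))
    X = 0.95 * common + 0.05 * rng.standard_normal((n, p))
    beta = np.ones(p)

    # AR(2) errors generated recursively
    e = np.zeros(n)
    for t in range(2, n):
        e[t] = phi1 * e[t - 1] + phi2 * e[t - 2] + sigma * rng.standard_normal()
    y = X @ beta + e

    # Whiten with the Cholesky factor of the AR(2) correlation matrix,
    # then apply each estimator to the transformed model
    C = np.linalg.cholesky(ar2_corr_matrix(n, phi1, phi2))
    Xs, ys = np.linalg.solve(C, X), np.linalg.solve(C, y)

    XtX, Xty, I = Xs.T @ Xs, Xs.T @ ys, np.eye(p)
    b_gls = np.linalg.solve(XtX, Xty)                         # Generalized Least-Squares
    b_ridge = np.linalg.solve(XtX + k * I, Xty)               # Ridge
    b_liu = np.linalg.solve(XtX + I, XtX + d * I) @ b_gls     # Liu
    b_kl = np.linalg.solve(XtX + k * I, XtX - k * I) @ b_gls  # Kibria-Lukman

    return {"GLS": b_gls, "Ridge": b_ridge, "Liu": b_liu, "KL": b_kl}, beta

# Empirical mean square error over Monte Carlo replications
reps = 2000
sums = {"GLS": 0.0, "Ridge": 0.0, "Liu": 0.0, "KL": 0.0}
for _ in range(reps):
    estimates, beta = simulate_once(sigma=5.0)
    for name, b in estimates.items():
        sums[name] += float(np.sum((b - beta) ** 2))
for name in sums:
    print(f"{name:5s} empirical MSE: {sums[name] / reps:.4f}")
```

Rerunning the sketch with a smaller error standard deviation (e.g., sigma=1.0) versus a larger one illustrates the pattern reported in the findings: the biased estimators behave similarly when the model variance is small and diverge as it grows.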