{"title":"推荐(稳健)非线性回归方法的统计学习","authors":"J. Kalina, J. Tichavský","doi":"10.2478/jamsi-2019-0008","DOIUrl":null,"url":null,"abstract":"Abstract We are interested in comparing the performance of various nonlinear estimators of parameters of the standard nonlinear regression model. While the standard nonlinear least squares estimator is vulnerable to the presence of outlying measurements in the data, there exist several robust alternatives. However, it is not clear which estimator should be used for a given dataset and this question remains extremely difficult (or perhaps infeasible) to be answered theoretically. Metalearning represents a computationally intensive methodology for optimal selection of algorithms (or methods) and is used here to predict the most suitable nonlinear estimator for a particular dataset. The classification rule is learned over a training database of 24 publicly available datasets. The results of the primary learning give an interesting argument in favor of the nonlinear least weighted squares estimator, which turns out to be the most suitable one for the majority of datasets. The subsequent metalearning reveals that tests of normality and heteroscedasticity play a crucial role in finding the most suitable nonlinear estimator.","PeriodicalId":43016,"journal":{"name":"Journal of Applied Mathematics Statistics and Informatics","volume":"15 1","pages":"47 - 59"},"PeriodicalIF":0.3000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Statistical learning for recommending (robust) nonlinear regression methods\",\"authors\":\"J. Kalina, J. Tichavský\",\"doi\":\"10.2478/jamsi-2019-0008\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Abstract We are interested in comparing the performance of various nonlinear estimators of parameters of the standard nonlinear regression model. While the standard nonlinear least squares estimator is vulnerable to the presence of outlying measurements in the data, there exist several robust alternatives. However, it is not clear which estimator should be used for a given dataset and this question remains extremely difficult (or perhaps infeasible) to be answered theoretically. Metalearning represents a computationally intensive methodology for optimal selection of algorithms (or methods) and is used here to predict the most suitable nonlinear estimator for a particular dataset. The classification rule is learned over a training database of 24 publicly available datasets. The results of the primary learning give an interesting argument in favor of the nonlinear least weighted squares estimator, which turns out to be the most suitable one for the majority of datasets. 
The subsequent metalearning reveals that tests of normality and heteroscedasticity play a crucial role in finding the most suitable nonlinear estimator.\",\"PeriodicalId\":43016,\"journal\":{\"name\":\"Journal of Applied Mathematics Statistics and Informatics\",\"volume\":\"15 1\",\"pages\":\"47 - 59\"},\"PeriodicalIF\":0.3000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Journal of Applied Mathematics Statistics and Informatics\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2478/jamsi-2019-0008\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q4\",\"JCRName\":\"MATHEMATICS, APPLIED\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Applied Mathematics Statistics and Informatics","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2478/jamsi-2019-0008","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q4","JCRName":"MATHEMATICS, APPLIED","Score":null,"Total":0}
Statistical learning for recommending (robust) nonlinear regression methods
Abstract. We are interested in comparing the performance of various nonlinear estimators of the parameters of the standard nonlinear regression model. While the standard nonlinear least squares estimator is vulnerable to the presence of outlying measurements in the data, several robust alternatives exist. However, it is not clear which estimator should be used for a given dataset, and this question remains extremely difficult (or perhaps infeasible) to answer theoretically. Metalearning represents a computationally intensive methodology for the optimal selection of algorithms (or methods) and is used here to predict the most suitable nonlinear estimator for a particular dataset. The classification rule is learned over a training database of 24 publicly available datasets. The results of the primary learning give an interesting argument in favor of the nonlinear least weighted squares estimator, which turns out to be the most suitable one for the majority of datasets. The subsequent metalearning reveals that tests of normality and heteroscedasticity play a crucial role in finding the most suitable nonlinear estimator.
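As a minimal illustration of the ideas summarized above (not the authors' implementation), the sketch below assumes a simple exponential regression model and shows (i) classical nonlinear least squares, (ii) a weighted variant that iteratively downweights the largest squared residuals, loosely in the spirit of the nonlinear least weighted squares estimator, and (iii) toy meta-features based on a normality test of the residuals and a crude heteroscedasticity measure, of the kind a metalearning classifier could use to recommend an estimator. The model, function names, weight scheme, and choice of tests are illustrative assumptions.

```python
# Illustrative sketch only -- not the paper's code. Assumes the model
# y = a*exp(b*x) + noise; the trimming-based weights and the meta-features
# are simplified stand-ins for the estimators and tests discussed above.
import numpy as np
from scipy.optimize import least_squares
from scipy.stats import shapiro


def model(params, x):
    a, b = params
    return a * np.exp(b * x)


def nls(x, y, start=(1.0, 0.1)):
    """Classical nonlinear least squares (sensitive to outliers)."""
    return least_squares(lambda p: model(p, x) - y, x0=start).x


def nlws(x, y, start=(1.0, 0.1), trim=0.75, n_iter=10):
    """Weighted variant: iteratively give zero weight to the largest
    squared residuals, mimicking the idea of downweighting outliers."""
    params = np.asarray(start, dtype=float)
    w = np.ones_like(y, dtype=float)
    for _ in range(n_iter):
        fit = least_squares(lambda p: np.sqrt(w) * (model(p, x) - y), x0=params)
        params = fit.x
        r2 = (model(params, x) - y) ** 2
        # keep full weight for the trim-fraction of smallest squared residuals
        cutoff = np.quantile(r2, trim)
        w = (r2 <= cutoff).astype(float)
    return params


def meta_features(x, y, params):
    """Toy meta-features: Shapiro-Wilk p-value for residual normality and
    the correlation of |residual| with x as a rough heteroscedasticity index."""
    r = y - model(params, x)
    p_normality = shapiro(r).pvalue
    heteroscedasticity = abs(np.corrcoef(np.abs(r), x)[0, 1])
    return np.array([p_normality, heteroscedasticity])
```

On top of such meta-features computed for a collection of datasets, a standard classifier (for example a decision tree) could be trained to predict which estimator performed best on each dataset, which is the essence of the metalearning step described in the abstract.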