{"title":"Construction of regression experiment optimal plan using parallel computing","authors":"L. Vladimirova, I. Fatyanova","doi":"10.1109/SCP.2015.7342140","DOIUrl":null,"url":null,"abstract":"In this paper, we consider the classical linear regression of the second order, the unknown parameters are usually evaluated by the method of least squares. The distribution of the error of parameter vector estimate depends on the plan choice. This choice is carried out to minimize the generalized variance of unknown parameters estimate or to maximize the information matrix determinant. To solve this extremal problem the random search is used on the basis of on the normal distribution. This method takes into account the information on the objective function by the use of covariance matrix. This method is iterative; at each iteration the search domain is gradually contracted round the point recognized to be most promising at previous iteration. So we have self-training method (named the method with a “memory”). The algorithm is simple and can be used for large dimension of search domain. In addition, this method is suitable for parallelization by distributing of numerical statistical tests among the processes [1, 2].","PeriodicalId":110366,"journal":{"name":"2015 International Conference \"Stability and Control Processes\" in Memory of V.I. Zubov (SCP)","volume":"31 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2015-12-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2015 International Conference \"Stability and Control Processes\" in Memory of V.I. 
Zubov (SCP)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SCP.2015.7342140","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 2
Abstract
In this paper, we consider the classical second-order linear regression model, whose unknown parameters are usually estimated by the method of least squares. The distribution of the error of the parameter vector estimate depends on the choice of the experimental plan. This choice is made so as to minimize the generalized variance of the parameter estimates or, equivalently, to maximize the determinant of the information matrix. To solve this extremal problem, a random search based on the normal distribution is used. The method takes information about the objective function into account through the covariance matrix. It is iterative: at each iteration the search domain is gradually contracted around the point recognized as most promising at the previous iteration, so the method is self-training (a method with a "memory"). The algorithm is simple and can be applied to search domains of large dimension. In addition, it is well suited to parallelization, since the numerical statistical tests can be distributed among the processes [1, 2].
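The iterative random search described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: the one-dimensional quadratic model, the design region [-1, 1], and all numerical parameters (sample count, contraction factor) are assumptions. Candidate plans are drawn from a normal distribution centered at the current best plan, and the spread is contracted at each iteration, which plays the role of the method's "memory".

```python
import numpy as np

def information_det(points):
    # Second-order model f(x) = b0 + b1*x + b2*x^2:
    # each design point x contributes the row (1, x, x^2).
    X = np.column_stack([np.ones_like(points), points, points**2])
    return np.linalg.det(X.T @ X)

def random_search(n_points=3, iters=50, samples=200,
                  sigma=1.0, contract=0.9, seed=0):
    rng = np.random.default_rng(seed)
    # Start from a random plan inside the design region [-1, 1].
    best = rng.uniform(-1.0, 1.0, n_points)
    best_val = information_det(best)
    for _ in range(iters):
        # Draw candidate plans from a normal distribution
        # centered at the current best plan.
        cands = rng.normal(best, sigma, size=(samples, n_points))
        cands = np.clip(cands, -1.0, 1.0)  # keep plans feasible
        vals = np.array([information_det(c) for c in cands])
        i = vals.argmax()
        if vals[i] > best_val:
            best, best_val = cands[i], vals[i]
        # Contract the search domain around the promising point.
        sigma *= contract
    return best, best_val

plan, det_val = random_search()
```

For this toy model the D-optimal three-point plan on [-1, 1] is known to be {-1, 0, 1} with determinant 4, so the search should approach that value; the evaluation of the candidate plans in each iteration is exactly the part that the paper proposes to distribute among parallel processes.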