{"title":"Minimax Regression with Bounded Noise","authors":"Yonina C. Eldar, A. Beck","doi":"10.1109/EEEI.2006.321098","DOIUrl":null,"url":null,"abstract":"We consider the problem of estimating a vector z in the regression model b = Az + w where w is an unknown but bounded noise and an upper bound on the norm of z is available. To estimate z we propose a relaxation of the Chebyshev center, which is the vector that minimizes the worst-case estimation error over all feasible vectors z. Relying on recent results regarding strong duality of nonconvex quadratic optimization problems with two quadratic constraints, we prove that in the complex domain our approach leads to the exact Chebyshev center. In the real domain, this strategy results in a \"pretty good\" approximation of the true Chebyshev center. As we show, our estimate can be viewed as a Tikhonov regularization with a special choice of parameter that can be found efficiently. We then demonstrate via numerical examples that our estimator can outperform other conventional methods, such as least-squares and regularized least-squares, with respect to the estimation error.","PeriodicalId":142814,"journal":{"name":"2006 IEEE 24th Convention of Electrical & Electronics Engineers in Israel","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2006-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2006 IEEE 24th Convention of Electrical & Electronics Engineers in Israel","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/EEEI.2006.321098","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
We consider the problem of estimating a vector z in the regression model b = Az + w, where w is an unknown but norm-bounded noise vector and an upper bound on the norm of z is available. To estimate z we propose a relaxation of the Chebyshev center, which is the vector that minimizes the worst-case estimation error over all feasible vectors z. Relying on recent results on strong duality of nonconvex quadratic optimization problems with two quadratic constraints, we prove that in the complex domain our approach yields the exact Chebyshev center. In the real domain, this strategy results in a "pretty good" approximation of the true Chebyshev center. As we show, our estimator can be viewed as Tikhonov regularization with a specific choice of regularization parameter that can be computed efficiently. We then demonstrate via numerical examples that our estimator can outperform conventional methods, such as least-squares and regularized least-squares, with respect to the estimation error.
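The abstract notes that the proposed estimator reduces to Tikhonov regularization with a particular, efficiently computable parameter. The sketch below is only a rough illustration of that connection, not the paper's relaxed Chebyshev-center construction: it solves a Tikhonov-regularized least-squares problem and picks the parameter by bisection so that an assumed norm bound ||z|| <= L becomes active. The function names (tikhonov_estimate, norm_constrained_estimate), the bisection rule, and the full-column-rank assumption on A are all illustrative choices, not taken from the paper.

```python
import numpy as np

def tikhonov_estimate(A, b, alpha):
    """Tikhonov-regularized least squares: argmin_z ||A z - b||^2 + alpha ||z||^2.

    Assumes A has full column rank when alpha = 0 so the normal equations
    are solvable.
    """
    n = A.shape[1]
    return np.linalg.solve(A.T @ A + alpha * np.eye(n), A.T @ b)

def norm_constrained_estimate(A, b, L, tol=1e-8, alpha_hi=1e6):
    """Illustrative stand-in: choose the Tikhonov parameter by bisection so
    that the norm bound ||z|| <= L is (approximately) active.

    This is NOT the paper's relaxed Chebyshev-center estimator; it only shows
    the generic mechanism by which a regularization parameter tied to a norm
    constraint can be found efficiently. Assumes alpha_hi is large enough
    that the regularized solution satisfies the bound.
    """
    z_ls = tikhonov_estimate(A, b, 0.0)
    if np.linalg.norm(z_ls) <= L:
        return z_ls  # unregularized least squares already satisfies the bound

    lo, hi = 0.0, alpha_hi  # norm(lo) > L, norm(hi) <= L (monotone in alpha)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if np.linalg.norm(tikhonov_estimate(A, b, mid)) > L:
            lo = mid  # solution norm still too large: regularize more
        else:
            hi = mid
    return tikhonov_estimate(A, b, hi)

# Toy usage on synthetic data (hypothetical example):
# A = np.random.randn(50, 10); z_true = np.random.randn(10)
# b = A @ z_true + 0.1 * np.random.randn(50)
# z_hat = norm_constrained_estimate(A, b, L=np.linalg.norm(z_true))
```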