{"title":"A Characterization of Optimal Prediction Measures via $\\ell_1$ Minimization","authors":"Len Bos","doi":"arxiv-2312.03091","DOIUrl":null,"url":null,"abstract":"Suppose that $K\\subset\\C$ is compact and that $z_0\\in\\C\\backslash K$ is an\nexternal point. An optimal prediction measure for regression by polynomials of\ndegree at most $n,$ is one for which the variance of the prediction at $z_0$ is\nas small as possible. Hoel and Levine (\\cite{HL}) have considered the case of\n$K=[-1,1]$ and $z_0=x_0\\in \\R\\backslash [-1,1],$ where they show that the\nsupport of the optimal measure is the $n+1$ extremme points of the Chebyshev\npolynomial $T_n(x)$ and characterizing the optimal weights in terms of absolute\nvalues of fundamental interpolating Lagrange polynomials. More recently,\n\\cite{BLO} has given the equivalence of the optimal prediction problem with\nthat of finding polynomials of extremal growth. They also study in detail the\ncase of $K=[-1,1]$ and $z_0=ia\\in i\\R,$ purely imaginary. In this work we\ngeneralize the Hoel-Levine formula to the general case when the support of the\noptimal measure is a finite set and give a formula for the optimal weights in\nterms of a $\\ell_1$ minimization problem.","PeriodicalId":501330,"journal":{"name":"arXiv - MATH - Statistics Theory","volume":"1 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2023-12-05","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Statistics Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2312.03091","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Abstract
Suppose that $K\subset\C$ is compact and that $z_0\in\C\backslash K$ is an
exterior point. An optimal prediction measure for regression by polynomials of
degree at most $n$ is one for which the variance of the prediction at $z_0$ is
as small as possible. Hoel and Levine (\cite{HL}) have considered the case of
$K=[-1,1]$ and $z_0=x_0\in \R\backslash [-1,1],$ where they show that the
support of the optimal measure is the set of $n+1$ extreme points of the
Chebyshev polynomial $T_n(x),$ and characterize the optimal weights in terms of
the absolute values of the fundamental Lagrange interpolating polynomials. More
recently, \cite{BLO} established the equivalence of the optimal prediction problem with
that of finding polynomials of extremal growth. They also study in detail the
case of $K=[-1,1]$ and $z_0=ia\in i\R,$ purely imaginary. In this work we
generalize the Hoel-Levine formula to the general case in which the support of
the optimal measure is a finite set, and give a formula for the optimal weights
in terms of an $\ell_1$ minimization problem.
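
As a concrete illustration of the classical Hoel-Levine result quoted above (not of the paper's new $\ell_1$ characterization), the following minimal numerical sketch computes the optimal weights at the Chebyshev extreme points and checks that the resulting prediction variance equals $T_n(x_0)^2$, the extremal-growth quantity. The degree $n$, the point $x_0$, and the helper `lagrange_at` are illustrative choices, not from the paper.

```python
import numpy as np

n = 5      # polynomial degree (illustrative choice)
x0 = 1.7   # extrapolation point outside [-1, 1] (illustrative choice)

# Extreme points of the Chebyshev polynomial T_n on [-1, 1]
nodes = np.cos(np.arange(n + 1) * np.pi / n)

def lagrange_at(x, j, nodes):
    """Fundamental Lagrange polynomial ell_j for the given nodes, evaluated at x."""
    others = np.delete(nodes, j)
    return np.prod((x - others) / (nodes[j] - others))

ell = np.array([lagrange_at(x0, j, nodes) for j in range(n + 1)])

# Hoel-Levine weights: proportional to |ell_j(x0)|
w = np.abs(ell) / np.abs(ell).sum()

# Prediction variance (up to sigma^2) at x0 for a design with weight w_j at node j:
#   sum_j ell_j(x0)^2 / w_j
var = np.sum(ell**2 / w)

# For x0 > 1 the minimal variance equals T_n(x0)^2
Tn_x0 = np.cosh(n * np.arccosh(x0))
print("optimal weights:", w)
print("prediction variance:", var, " T_n(x0)^2:", Tn_x0**2)
```

The two printed values agree because, with $w_j \propto |\ell_j(x_0)|$, the variance collapses to $\bigl(\sum_j |\ell_j(x_0)|\bigr)^2$, and on the Chebyshev extreme points this sum equals $|T_n(x_0)|$ for real $x_0$ outside $[-1,1]$.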