A sieve stochastic gradient descent estimator for online nonparametric regression in Sobolev ellipsoids.

Public Archaeology · IF 0.8 · Pub Date: 2022-10-01 · Epub Date: 2022-10-27 · DOI: 10.1214/22-aos2212
Tianyu Zhang, Noah Simon
Citations: 3

Abstract


The goal of regression is to recover an unknown underlying function that best links a set of predictors to an outcome from noisy observations. In nonparametric regression, one assumes that the regression function belongs to a pre-specified infinite-dimensional function space (the hypothesis space). In the online setting, when the observations arrive in a stream, it is computationally preferable to iteratively update an estimate rather than repeatedly refit an entire model. Inspired by nonparametric sieve estimation and stochastic approximation methods, we propose a sieve stochastic gradient descent estimator (Sieve-SGD) for the case in which the hypothesis space is a Sobolev ellipsoid. We show that Sieve-SGD has rate-optimal mean squared error (MSE) under a set of simple and direct conditions. The proposed estimator can be constructed with low computational (time and space) expense: we also formally show that Sieve-SGD requires almost minimal memory usage among all statistically rate-optimal estimators.
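The recipe described in the abstract — expand the estimate in a fixed basis, let the number of active basis functions (the sieve) grow slowly with the sample size, and take one cheap gradient step per incoming observation — can be sketched in a few lines. This is a toy illustration, not the authors' exact algorithm: the cosine basis, the sieve growth rate J_n ≈ n^(1/(2s+1)) for assumed smoothness s, the step-size schedule, and the iterate averaging are all simplified assumptions on our part.

```python
import numpy as np

def cosine_basis(x, J):
    """First J elements of the cosine basis of L^2[0, 1]:
    psi_0(x) = 1, psi_j(x) = sqrt(2) * cos(pi * j * x) for j >= 1."""
    psi = np.sqrt(2.0) * np.cos(np.pi * np.arange(J) * x)
    psi[0] = 1.0
    return psi

def sieve_sgd(stream, s=2.0, J_max=200):
    """Toy sieve-SGD for a stream of (x, y) pairs with x in [0, 1].

    The number of active basis functions grows like n^(1/(2s+1)),
    where s is the assumed smoothness. Each observation triggers one
    gradient step on the squared-error loss, and the running average
    of the iterates (Polyak-style averaging) is returned."""
    beta = np.zeros(J_max)       # current SGD iterate
    beta_bar = np.zeros(J_max)   # running average of iterates
    for n, (x, y) in enumerate(stream, start=1):
        J = min(J_max, int(np.ceil(n ** (1.0 / (2.0 * s + 1.0)))))
        psi = cosine_basis(x, J)
        resid = y - beta[:J] @ psi                # prediction error
        gamma = n ** (-1.0 / (2.0 * s + 1.0))     # decaying step size
        beta[:J] += gamma * resid * psi           # one SGD step
        beta_bar += (beta - beta_bar) / n         # online averaging
    return beta_bar

# usage: learn f(x) = cos(2 pi x) from a noisy stream of 5000 points
rng = np.random.default_rng(0)
xs = rng.uniform(size=5000)
ys = np.cos(2 * np.pi * xs) + 0.1 * rng.normal(size=5000)
beta_hat = sieve_sgd(zip(xs, ys))

grid = np.linspace(0.0, 1.0, 200)
f_hat = np.array([beta_hat @ cosine_basis(g, len(beta_hat)) for g in grid])
mse = float(np.mean((f_hat - np.cos(2 * np.pi * grid)) ** 2))
```

The per-observation cost is O(J_n), and only the J_n active coefficients need to be stored, which is the sense in which this style of estimator is cheap in both time and memory compared with repeatedly refitting a kernel or spline model on the full history.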
