On Adaptive Stochastic Optimization for Streaming Data: A Newton's Method with O(dN) Operations

Antoine Godichon-Baggioni (LPSM), Nicklas Werge
DOI: arxiv-2311.17753
Journal: arXiv - MATH - Statistics Theory
Publication date: 2023-11-29
Citations: 0

Abstract

Stochastic optimization methods encounter new challenges in the realm of streaming, characterized by a continuous flow of large, high-dimensional data. While first-order methods, like stochastic gradient descent, are the natural choice, they often struggle with ill-conditioned problems. In contrast, second-order methods, such as Newton's methods, offer a potential solution, but their computational demands render them impractical. This paper introduces adaptive stochastic optimization methods that bridge the gap between addressing ill-conditioned problems while functioning in a streaming context. Notably, we present an adaptive inversion-free Newton's method with a computational complexity matching that of first-order methods, $\mathcal{O}(dN)$, where $d$ represents the number of dimensions/features, and $N$ the number of data. Theoretical analysis confirms their asymptotic efficiency, and empirical evidence demonstrates their effectiveness, especially in scenarios involving complex covariance structures and challenging initializations. In particular, our adaptive Newton's methods outperform existing methods, while maintaining favorable computational efficiency.
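The inversion-free idea behind such methods can be illustrated with a standard trick: for a least-squares objective, the Hessian grows by a rank-one term $x_n x_n^\top$ at each observation, so its inverse can be maintained with the Sherman-Morrison formula instead of being recomputed. The sketch below is not the paper's algorithm (the paper's $\mathcal{O}(dN)$ total cost implies $\mathcal{O}(d)$ per step, which requires additional structure); this generic recursive-least-squares-style sketch costs $\mathcal{O}(d^2)$ per step but still avoids any $\mathcal{O}(d^3)$ matrix inversion. The function names and the regularization parameter `lam` are illustrative choices, not from the paper.

```python
import numpy as np

def sherman_morrison_update(A_inv, u, v):
    """Return (A + u v^T)^{-1} given A^{-1}, in O(d^2), with no inversion."""
    Au = A_inv @ u
    vA = v @ A_inv
    return A_inv - np.outer(Au, vA) / (1.0 + v @ Au)

def streaming_newton(X, y, lam=1.0):
    """Newton-type streaming least squares: the inverse Hessian estimate
    is maintained via rank-one Sherman-Morrison updates, so each step is
    O(d^2) rather than the O(d^3) of a fresh matrix inversion."""
    N, d = X.shape
    theta = np.zeros(d)
    H_inv = np.eye(d) / lam  # regularized initial inverse Hessian
    for n in range(N):
        x_n, y_n = X[n], y[n]
        # The running least-squares Hessian gains the rank-one term x_n x_n^T.
        H_inv = sherman_morrison_update(H_inv, x_n, x_n)
        grad = (x_n @ theta - y_n) * x_n  # per-sample gradient
        theta -= H_inv @ grad             # Newton-type step
    return theta
```

Because the step is preconditioned by an inverse-Hessian estimate rather than a scalar learning rate, it adapts to the curvature of the objective, which is what makes such methods robust to the ill-conditioned problems mentioned in the abstract.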