Randomized low-rank Runge-Kutta methods

Hei Yin Lam, Gianluca Ceruti, Daniel Kressner
{"title":"Randomized low-rank Runge-Kutta methods","authors":"Hei Yin Lam, Gianluca Ceruti, Daniel Kressner","doi":"arxiv-2409.06384","DOIUrl":null,"url":null,"abstract":"This work proposes and analyzes a new class of numerical integrators for\ncomputing low-rank approximations to solutions of matrix differential equation.\nWe combine an explicit Runge-Kutta method with repeated randomized low-rank\napproximation to keep the rank of the stages limited. The so-called generalized\nNystr\\\"om method is particularly well suited for this purpose; it builds\nlow-rank approximations from random sketches of the discretized dynamics. In\ncontrast, all existing dynamical low-rank approximation methods are\ndeterministic and usually perform tangent space projections to limit rank\ngrowth. Using such tangential projections can result in larger error compared\nto approximating the dynamics directly. Moreover, sketching allows for\nincreased flexibility and efficiency by choosing structured random matrices\nadapted to the structure of the matrix differential equation. Under suitable\nassumptions, we establish moment and tail bounds on the error of our randomized\nlow-rank Runge-Kutta methods. When combining the classical Runge-Kutta method\nwith generalized Nystr\\\"om, we obtain a method called Rand RK4, which exhibits\nfourth-order convergence numerically -- up to the low-rank approximation error.\nFor a modified variant of Rand RK4, we also establish fourth-order convergence\ntheoretically. Numerical experiments for a range of examples from the\nliterature demonstrate that randomized low-rank Runge-Kutta methods compare\nfavorably with two popular dynamical low-rank approximation methods, in terms\nof robustness and speed of convergence.","PeriodicalId":501162,"journal":{"name":"arXiv - MATH - Numerical Analysis","volume":"55 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - MATH - Numerical Analysis","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06384","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

This work proposes and analyzes a new class of numerical integrators for computing low-rank approximations to solutions of matrix differential equations. We combine an explicit Runge-Kutta method with repeated randomized low-rank approximation to keep the rank of the stages limited. The so-called generalized Nyström method is particularly well suited for this purpose; it builds low-rank approximations from random sketches of the discretized dynamics. In contrast, all existing dynamical low-rank approximation methods are deterministic and usually perform tangent-space projections to limit rank growth. Using such tangential projections can result in larger error compared to approximating the dynamics directly. Moreover, sketching allows for increased flexibility and efficiency by choosing structured random matrices adapted to the structure of the matrix differential equation. Under suitable assumptions, we establish moment and tail bounds on the error of our randomized low-rank Runge-Kutta methods. Combining the classical Runge-Kutta method with generalized Nyström yields a method called Rand RK4, which numerically exhibits fourth-order convergence, up to the low-rank approximation error. For a modified variant of Rand RK4, we also establish fourth-order convergence theoretically. Numerical experiments for a range of examples from the literature demonstrate that randomized low-rank Runge-Kutta methods compare favorably with two popular dynamical low-rank approximation methods in terms of robustness and speed of convergence.
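To make the two ingredients of the abstract concrete, the following is a minimal NumPy sketch: a generalized Nyström low-rank approximation built from two Gaussian random sketches, and a single classical RK4 step in which every stage and the update are compressed back to rank r. The function names, the choice of Gaussian sketch matrices, and the placement of the truncations are illustrative assumptions; the paper's actual Rand RK4 method may organize sketches and truncations differently, and in practice one would keep low-rank factors rather than forming dense matrices.

```python
import numpy as np

def generalized_nystrom(A, r, p=5, rng=None):
    """Rank-r approximation of A via the generalized Nystrom formula
    A X (Y^T A X)^+ Y^T A, with Gaussian sketches X (n x r) and
    Y (m x (r+p)), where p is a small oversampling parameter."""
    rng = np.random.default_rng(rng)
    m, n = A.shape
    X = rng.standard_normal((n, r))        # right sketch matrix
    Y = rng.standard_normal((m, r + p))    # oversampled left sketch matrix
    AX = A @ X                             # sketch of the range, m x r
    YtA = Y.T @ A                          # sketch of the co-range, (r+p) x n
    core = np.linalg.pinv(Y.T @ AX)        # pseudoinverse of the core matrix
    # Dense product for clarity; a practical code would keep the factors.
    return AX @ core @ YtA

def rand_rk4_step(F, A, h, r, rng=None):
    """One classical RK4 step for dA/dt = F(A); each stage and the final
    update are truncated to rank r by the generalized Nystrom sketch.
    Illustrative only: not necessarily the authors' exact Rand RK4 scheme."""
    trunc = lambda M: generalized_nystrom(M, r, rng=rng)
    K1 = trunc(F(A))
    K2 = trunc(F(A + 0.5 * h * K1))
    K3 = trunc(F(A + 0.5 * h * K2))
    K4 = trunc(F(A + h * K3))
    return trunc(A + (h / 6.0) * (K1 + 2 * K2 + 2 * K3 + K4))

if __name__ == "__main__":
    # Hypothetical test problem: a Lyapunov-type matrix ODE dA/dt = L A + A L^T.
    rng = np.random.default_rng(0)
    n, r, h = 100, 10, 1e-2
    L = rng.standard_normal((n, n)) / np.sqrt(n)
    A0 = generalized_nystrom(rng.standard_normal((n, n)), r, rng=rng)
    F = lambda A: L @ A + A @ L.T
    A1 = rand_rk4_step(F, A0, h, r, rng=rng)
    print("numerical rank of the step:", np.linalg.matrix_rank(A1))
```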