{"title":"A quantum inspired approach to learning dynamical laws from data—block-sparsity and gauge-mediated weight sharing","authors":"J Fuksa, M Götte, I Roth, J Eisert","doi":"10.1088/2632-2153/ad4f4e","DOIUrl":null,"url":null,"abstract":"Recent years have witnessed an increased interest in recovering dynamical laws of complex systems in a largely data-driven fashion under meaningful hypotheses. In this work, we propose a scalable and numerically robust method for this task, utilizing efficient block-sparse tensor train representations of dynamical laws, inspired by similar approaches in quantum many-body systems. Low-rank tensor train representations have been previously derived for dynamical laws of one-dimensional systems. We extend this result to efficient representations of systems with <italic toggle=\"yes\">K</italic>-mode interactions and controlled approximations of systems with decaying interactions. We further argue that natural structure assumptions on dynamical laws, such as bounded polynomial degrees, can be exploited in the form of block-sparse support patterns of tensor-train cores. Additional structural similarities between interactions of certain modes can be accounted for by weight sharing within the ansatz. To make use of these structure assumptions, we propose a novel optimization algorithm, block-sparsity restricted alternating least squares with gauge-mediated weight sharing. The algorithm is inspired by similar notions in machine learning and achieves a significant improvement in performance over previous approaches. We demonstrate the performance of the method numerically on three one-dimensional systems—the Fermi–Pasta–Ulam–Tsingou system, rotating magnetic dipoles and point particles interacting via modified Lennard–Jones potentials, observing a highly accurate and noise-robust recovery.","PeriodicalId":33757,"journal":{"name":"Machine Learning Science and Technology","volume":null,"pages":null},"PeriodicalIF":6.3000,"publicationDate":"2024-06-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Machine Learning Science and Technology","FirstCategoryId":"101","ListUrlMain":"https://doi.org/10.1088/2632-2153/ad4f4e","RegionNum":2,"RegionCategory":"物理与天体物理","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Abstract
Recent years have witnessed an increased interest in recovering dynamical laws of complex systems in a largely data-driven fashion under meaningful hypotheses. In this work, we propose a scalable and numerically robust method for this task, utilizing efficient block-sparse tensor train representations of dynamical laws, inspired by similar approaches in quantum many-body systems. Low-rank tensor train representations have been previously derived for dynamical laws of one-dimensional systems. We extend this result to efficient representations of systems with K-mode interactions and controlled approximations of systems with decaying interactions. We further argue that natural structure assumptions on dynamical laws, such as bounded polynomial degrees, can be exploited in the form of block-sparse support patterns of tensor-train cores. Additional structural similarities between interactions of certain modes can be accounted for by weight sharing within the ansatz. To make use of these structure assumptions, we propose a novel optimization algorithm, block-sparsity restricted alternating least squares with gauge-mediated weight sharing. The algorithm is inspired by similar notions in machine learning and achieves a significant improvement in performance over previous approaches. We demonstrate the performance of the method numerically on three one-dimensional systems—the Fermi–Pasta–Ulam–Tsingou system, rotating magnetic dipoles and point particles interacting via modified Lennard–Jones potentials, observing a highly accurate and noise-robust recovery.
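As a rough illustration of the tensor-train ansatz referred to in the abstract (a minimal sketch, not the authors' implementation), the snippet below evaluates a multivariate function stored as tensor-train cores contracted with per-mode polynomial features. The core shapes, ranks, and monomial feature map are illustrative assumptions; the paper's block-sparse support patterns and weight sharing are not reproduced here.

```python
# Minimal sketch: evaluating a function stored in tensor-train (TT) format.
# Assumed setup: each mode j contributes a feature vector phi_j(x_j)
# (here monomials up to degree 2), and the coefficient tensor is given by
# TT cores G[j] of shape (r_{j-1}, d_j, r_j) with boundary ranks r_0 = r_K = 1.
import numpy as np

def tt_evaluate(cores, features):
    """Contract TT cores with per-mode feature vectors to evaluate the ansatz."""
    v = np.ones((1,))  # running boundary vector; boundary rank is 1
    for G, phi in zip(cores, features):
        # G: (r_prev, d, r_next), phi: (d,) -> contract the physical index,
        # then absorb the result into the running boundary vector.
        v = v @ np.tensordot(G, phi, axes=([1], [0]))
    return v.item()  # final boundary rank is 1, so v has a single entry

# Toy example: 4 modes, random cores, polynomial degree <= 2 (d = 3).
rng = np.random.default_rng(0)
ranks = [1, 2, 2, 2, 1]
d = 3
cores = [rng.normal(size=(ranks[j], d, ranks[j + 1])) for j in range(4)]
x = rng.normal(size=4)
features = [np.array([1.0, xj, xj**2]) for xj in x]  # monomial features per mode
print(tt_evaluate(cores, features))
```

The low TT ranks are what keep the parametrization efficient: the number of parameters grows linearly in the number of modes rather than exponentially, which is the structural property the abstract's block-sparse representations build on.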
About the journal:
Machine Learning Science and Technology is a multidisciplinary open access journal that bridges the application of machine learning across the sciences with advances in machine learning methods and theory as motivated by physical insights. Specifically, articles must fall into one of the following categories: they advance the state of machine-learning-driven applications in the sciences, or they make conceptual, methodological, or theoretical advances in machine learning that have applications to, draw inspiration from, or are motivated by scientific problems.