Generative modeling through internal high-dimensional chaotic activity
Samantha J. Fournier, Pierfrancesco Urbani
arXiv - PHYS - Disordered Systems and Neural Networks, 2024-05-17
arXiv:2405.10822
Citations: 0
Abstract
Generative modeling aims to produce new datapoints whose statistical properties resemble those of a training dataset. In recent years, there has been a burst of machine learning techniques and settings that achieve this goal with remarkable performance. In most of these settings, the training dataset is used in conjunction with noise, which is added as a source of statistical variability and is essential for the generative task. Here, we explore the idea of using the internal chaotic dynamics of high-dimensional chaotic systems as a way to generate new datapoints from a training dataset. We show that simple learning rules can achieve this goal within a set of vanilla architectures, and we characterize the quality of the generated datapoints through standard accuracy measures.