Universality of Real Minimal Complexity Reservoir

Robert Simon Fong, Boyu Li, Peter Tiňo
{"title":"真实最小复杂性水库的普遍性","authors":"Robert Simon Fong, Boyu Li, Peter Tiňo","doi":"arxiv-2408.08071","DOIUrl":null,"url":null,"abstract":"Reservoir Computing (RC) models, a subclass of recurrent neural networks, are\ndistinguished by their fixed, non-trainable input layer and dynamically coupled\nreservoir, with only the static readout layer being trained. This design\ncircumvents the issues associated with backpropagating error signals through\ntime, thereby enhancing both stability and training efficiency. RC models have\nbeen successfully applied across a broad range of application domains.\nCrucially, they have been demonstrated to be universal approximators of\ntime-invariant dynamic filters with fading memory, under various settings of\napproximation norms and input driving sources. Simple Cycle Reservoirs (SCR) represent a specialized class of RC models with\na highly constrained reservoir architecture, characterized by uniform ring\nconnectivity and binary input-to-reservoir weights with an aperiodic sign\npattern. For linear reservoirs, given the reservoir size, the reservoir\nconstruction has only one degree of freedom -- the reservoir cycle weight. Such\narchitectures are particularly amenable to hardware implementations without\nsignificant performance degradation in many practical tasks. In this study we\nendow these observations with solid theoretical foundations by proving that\nSCRs operating in real domain are universal approximators of time-invariant\ndynamic filters with fading memory. Our results supplement recent research\nshowing that SCRs in the complex domain can approximate, to arbitrary\nprecision, any unrestricted linear reservoir with a non-linear readout. We\nfurthermore introduce a novel method to drastically reduce the number of SCR\nunits, making such highly constrained architectures natural candidates for\nlow-complexity hardware implementations. Our findings are supported by\nempirical studies on real-world time series datasets.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"87 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-08-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Universality of Real Minimal Complexity Reservoir\",\"authors\":\"Robert Simon Fong, Boyu Li, Peter Tiňo\",\"doi\":\"arxiv-2408.08071\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Reservoir Computing (RC) models, a subclass of recurrent neural networks, are\\ndistinguished by their fixed, non-trainable input layer and dynamically coupled\\nreservoir, with only the static readout layer being trained. This design\\ncircumvents the issues associated with backpropagating error signals through\\ntime, thereby enhancing both stability and training efficiency. RC models have\\nbeen successfully applied across a broad range of application domains.\\nCrucially, they have been demonstrated to be universal approximators of\\ntime-invariant dynamic filters with fading memory, under various settings of\\napproximation norms and input driving sources. Simple Cycle Reservoirs (SCR) represent a specialized class of RC models with\\na highly constrained reservoir architecture, characterized by uniform ring\\nconnectivity and binary input-to-reservoir weights with an aperiodic sign\\npattern. For linear reservoirs, given the reservoir size, the reservoir\\nconstruction has only one degree of freedom -- the reservoir cycle weight. 
Such\\narchitectures are particularly amenable to hardware implementations without\\nsignificant performance degradation in many practical tasks. In this study we\\nendow these observations with solid theoretical foundations by proving that\\nSCRs operating in real domain are universal approximators of time-invariant\\ndynamic filters with fading memory. Our results supplement recent research\\nshowing that SCRs in the complex domain can approximate, to arbitrary\\nprecision, any unrestricted linear reservoir with a non-linear readout. We\\nfurthermore introduce a novel method to drastically reduce the number of SCR\\nunits, making such highly constrained architectures natural candidates for\\nlow-complexity hardware implementations. Our findings are supported by\\nempirical studies on real-world time series datasets.\",\"PeriodicalId\":501347,\"journal\":{\"name\":\"arXiv - CS - Neural and Evolutionary Computing\",\"volume\":\"87 1\",\"pages\":\"\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-08-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - CS - Neural and Evolutionary Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2408.08071\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2408.08071","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Reservoir Computing (RC) models, a subclass of recurrent neural networks, are distinguished by their fixed, non-trainable input layer and dynamically coupled reservoir, with only the static readout layer being trained. This design circumvents the issues associated with backpropagating error signals through time, thereby enhancing both stability and training efficiency. RC models have been successfully applied across a broad range of application domains. Crucially, they have been demonstrated to be universal approximators of time-invariant dynamic filters with fading memory, under various settings of approximation norms and input driving sources. Simple Cycle Reservoirs (SCR) represent a specialized class of RC models with a highly constrained reservoir architecture, characterized by uniform ring connectivity and binary input-to-reservoir weights with an aperiodic sign pattern. For linear reservoirs, given the reservoir size, the reservoir construction has only one degree of freedom -- the reservoir cycle weight. Such architectures are particularly amenable to hardware implementations without significant performance degradation in many practical tasks. In this study we endow these observations with solid theoretical foundations by proving that SCRs operating in real domain are universal approximators of time-invariant dynamic filters with fading memory. Our results supplement recent research showing that SCRs in the complex domain can approximate, to arbitrary precision, any unrestricted linear reservoir with a non-linear readout. We furthermore introduce a novel method to drastically reduce the number of SCR units, making such highly constrained architectures natural candidates for low-complexity hardware implementations. Our findings are supported by empirical studies on real-world time series datasets.
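To make the SCR construction concrete, the sketch below (not from the paper; the function name `scr_readout_fit`, the reservoir size, the cycle weight `r`, and the use of pi's binary digits for the sign pattern are all illustrative assumptions) builds the ring coupling matrix, a ±1 input vector with an aperiodic sign pattern, drives the linear reservoir, and trains the only learned component, a ridge-regression readout.

```python
# Minimal SCR sketch (illustrative, not the authors' implementation):
# a linear reservoir with uniform ring connectivity of weight r (the single
# degree of freedom) and binary +/-1 input weights following an aperiodic
# sign pattern, here taken from the binary expansion of pi's fractional part.
import numpy as np

def scr_readout_fit(inputs, targets, n_units=50, r=0.9, ridge=1e-6):
    """Drive an SCR with a scalar input series and fit a linear readout."""
    # Ring (cycle) coupling: unit i feeds unit (i+1) mod n_units with weight r.
    W = np.zeros((n_units, n_units))
    for i in range(n_units):
        W[(i + 1) % n_units, i] = r

    # Aperiodic +/-1 input sign pattern from the binary digits of pi - 3.
    frac = np.pi - 3.0
    v = np.empty(n_units)
    for i in range(n_units):
        frac *= 2.0
        bit = int(frac)
        frac -= bit
        v[i] = 1.0 if bit else -1.0  # input-to-reservoir weights, magnitude 1

    # Collect reservoir states under the linear update x_{t+1} = W x_t + v u_t.
    x = np.zeros(n_units)
    states = []
    for u in inputs:
        x = W @ x + v * u
        states.append(x.copy())
    X = np.array(states)

    # Ridge-regression readout: the only trained part of an RC model.
    A = X.T @ X + ridge * np.eye(n_units)
    w_out = np.linalg.solve(A, X.T @ targets)
    return W, v, w_out, X

# Example: one-step-ahead prediction of a noisy sine wave.
u = np.sin(0.2 * np.arange(1000)) + 0.05 * np.random.randn(1000)
W, v, w_out, X = scr_readout_fit(u[:-1], u[1:])
print("train MSE:", np.mean((X @ w_out - u[1:]) ** 2))
```

With the cycle weight below 1 the linear reservoir has fading memory, so only `r` (and the reservoir size) needs to be chosen, which is what makes the architecture attractive for hardware implementations.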