A Forward Learning Algorithm for Neural Memory Ordinary Differential Equations

International Journal of Neural Systems. Published: 2024-09-01 (Epub 2024-06-21). DOI: 10.1142/S0129065724500485
Xiuyuan Xu, Haiying Luo, Zhang Yi, Haixian Zhang
Citations: 0

Abstract


The deep neural network, based on the backpropagation learning algorithm, has achieved tremendous success. However, the backpropagation algorithm is widely considered biologically implausible. Many efforts have recently been made to address these biological-implausibility issues; nevertheless, these methods are tailored to discrete neural network structures. Continuous neural networks are crucial for investigating novel neural network models with more biologically dynamic characteristics and for the interpretability of large language models. The neural memory ordinary differential equation (nmODE) is a recently proposed continuous neural network model that exhibits several intriguing properties. In this study, we present a forward-learning algorithm, called nmForwardLA, for nmODE. The algorithm operates in lower computational dimensions and is more efficient. Compared with other learning algorithms, experimental results on MNIST, CIFAR10, and CIFAR100 demonstrate its potency.
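To make the model concrete, here is a minimal sketch of an nmODE-style memory layer integrated with forward Euler. The dynamics dy/dt = -y + sin²(y + γ(x)), with γ(x) realized as a learned linear map W·x, follow the form reported for nmODE, but the step size, step count, and the linear choice of γ are illustrative assumptions, not the paper's exact configuration; nmForwardLA itself (the forward learning rule) is not reproduced here.

```python
import numpy as np

def nmode_layer(x, W, steps=20, dt=0.1):
    """Illustrative nmODE-style memory layer (not the paper's exact code).

    The memory state y evolves under dy/dt = -y + sin^2(y + gamma(x)),
    where gamma(x) = W @ x is an assumed linear input mapping.
    Integration uses simple forward Euler with a fixed step size.
    """
    y = np.zeros(W.shape[0])       # memory state starts at rest
    gamma = W @ x                  # input enters only through gamma(x)
    for _ in range(steps):
        # Euler step: leak term -y plus bounded sin^2 nonlinearity
        y = y + dt * (-y + np.sin(y + gamma) ** 2)
    return y

rng = np.random.default_rng(0)
x = rng.standard_normal(4)         # toy input
W = rng.standard_normal((3, 4))    # hypothetical learned weights
y = nmode_layer(x, W)
print(y.shape)  # (3,)
```

Because sin² is bounded in [0, 1] and the -y leak term is contractive, the state stays bounded for dt < 1 regardless of the input scale, which is one of the stability properties that makes continuous memory models like nmODE attractive.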
