Sparse learning enabled by constraints on connectivity and function

Mirza M. Junaid Baig, Armen Stepanyants
{"title":"通过对连接性和功能的限制实现稀疏学习","authors":"Mirza M. Junaid Baig, Armen Stepanyants","doi":"arxiv-2409.04946","DOIUrl":null,"url":null,"abstract":"Sparse connectivity is a hallmark of the brain and a desired property of\nartificial neural networks. It promotes energy efficiency, simplifies training,\nand enhances the robustness of network function. Thus, a detailed understanding\nof how to achieve sparsity without jeopardizing network performance is\nbeneficial for neuroscience, deep learning, and neuromorphic computing\napplications. We used an exactly solvable model of associative learning to\nevaluate the effects of various sparsity-inducing constraints on connectivity\nand function. We determine the optimal level of sparsity achieved by the $l_0$\nnorm constraint and find that nearly the same efficiency can be obtained by\neliminating weak connections. We show that this method of achieving sparsity\ncan be implemented online, making it compatible with neuroscience and machine\nlearning applications.","PeriodicalId":501517,"journal":{"name":"arXiv - QuanBio - Neurons and Cognition","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Sparse learning enabled by constraints on connectivity and function\",\"authors\":\"Mirza M. Junaid Baig, Armen Stepanyants\",\"doi\":\"arxiv-2409.04946\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Sparse connectivity is a hallmark of the brain and a desired property of\\nartificial neural networks. It promotes energy efficiency, simplifies training,\\nand enhances the robustness of network function. Thus, a detailed understanding\\nof how to achieve sparsity without jeopardizing network performance is\\nbeneficial for neuroscience, deep learning, and neuromorphic computing\\napplications. We used an exactly solvable model of associative learning to\\nevaluate the effects of various sparsity-inducing constraints on connectivity\\nand function. We determine the optimal level of sparsity achieved by the $l_0$\\nnorm constraint and find that nearly the same efficiency can be obtained by\\neliminating weak connections. We show that this method of achieving sparsity\\ncan be implemented online, making it compatible with neuroscience and machine\\nlearning applications.\",\"PeriodicalId\":501517,\"journal\":{\"name\":\"arXiv - QuanBio - Neurons and Cognition\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - QuanBio - Neurons and Cognition\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.04946\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - QuanBio - Neurons and Cognition","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.04946","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Sparse connectivity is a hallmark of the brain and a desired property of artificial neural networks. It promotes energy efficiency, simplifies training, and enhances the robustness of network function. Thus, a detailed understanding of how to achieve sparsity without jeopardizing network performance is beneficial for neuroscience, deep learning, and neuromorphic computing applications. We used an exactly solvable model of associative learning to evaluate the effects of various sparsity-inducing constraints on connectivity and function. We determine the optimal level of sparsity achieved by the $l_0$ norm constraint and find that nearly the same efficiency can be obtained by eliminating weak connections. We show that this method of achieving sparsity can be implemented online, making it compatible with neuroscience and machine learning applications.
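
To make the pruning idea concrete, below is a minimal sketch of one way "eliminating weak connections" could be carried out online during perceptron-style associative learning. It is not the paper's exactly solvable model: the network size, number of associations, learning rate, and pruning threshold are all illustrative assumptions.

```python
# Minimal sketch (assumptions, not the paper's exact model): perceptron-style
# associative learning on random patterns, with sparsity obtained online by
# zeroing connections that remain weak after each pass through the data.
import numpy as np

rng = np.random.default_rng(0)

N = 200           # number of input connections (assumed)
P = 100           # number of stored associations (assumed, below capacity)
epochs = 200
lr = 0.1
prune_frac = 0.5  # prune weights weaker than this fraction of the mean |w| (assumed)

X = rng.choice([-1.0, 1.0], size=(P, N))   # random binary input patterns
y = rng.choice([-1.0, 1.0], size=P)        # random binary target outputs

w = np.zeros(N)
for _ in range(epochs):
    for mu in range(P):
        # Perceptron rule: update only when association mu is not yet stored.
        if y[mu] * (w @ X[mu]) <= 0.0:
            w += lr * y[mu] * X[mu]
    # Online sparsification: eliminate connections that stay weak.
    if np.any(w != 0.0):
        w[np.abs(w) < prune_frac * np.mean(np.abs(w))] = 0.0

sparsity = np.mean(w == 0.0)
accuracy = np.mean(np.sign(X @ w) == y)
print(f"zeroed connections: {sparsity:.2f}, stored associations: {accuracy:.2f}")
```

Because pruning is interleaved with the weight updates rather than applied after training, this kind of scheme can run online, which is the compatibility with neuroscience and machine learning settings that the abstract highlights.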