Breaking Neural Network Scaling Laws with Modularity

Akhilan Boopathy, Sunshine Jiang, William Yue, Jaedong Hwang, Abhiram Iyer, Ila Fiete
arXiv:2409.05780 · arXiv - STAT - Machine Learning · Published 2024-09-09 · Citations: 0

Abstract

Modular neural networks outperform nonmodular neural networks on tasks ranging from visual question answering to robotics. These performance improvements are thought to be due to modular networks' superior ability to model the compositional and combinatorial structure of real-world problems. However, a theoretical explanation of how modularity improves generalizability, and how to leverage task modularity while training networks remains elusive. Using recent theoretical progress in explaining neural network generalization, we investigate how the amount of training data required to generalize on a task varies with the intrinsic dimensionality of a task's input. We show theoretically that when applied to modularly structured tasks, while nonmodular networks require an exponential number of samples with task dimensionality, modular networks' sample complexity is independent of task dimensionality: modular networks can generalize in high dimensions. We then develop a novel learning rule for modular networks to exploit this advantage and empirically show the improved generalization of the rule, both in- and out-of-distribution, on high-dimensional, modular tasks.
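The sample-complexity contrast described above can be made concrete with a toy counting argument. This is an illustrative sketch under simplifying assumptions (binary inputs, a task that decomposes exactly into independent modules, and a learner that generalizes by covering the relevant input space), not the paper's formal derivation:

```python
# Toy setup (hypothetical, for illustration only): a task over d binary
# inputs that decomposes into m independent modules of k inputs each,
# with d = m * k.
d, k = 12, 3
m = d // k  # 4 modules

# A nonmodular learner that must distinguish every joint input needs
# coverage of the full input space: 2^d examples, exponential in d.
nonmodular_cover = 2 ** d

# A learner matched to the modular structure only needs to cover each
# module's k-dimensional input space: m * 2^k examples, which grows
# linearly in d for a fixed module size k.
modular_cover = m * 2 ** k

print(nonmodular_cover)  # 4096
print(modular_cover)     # 32
```

Scaling d from 12 to 24 doubles the modular count (to 64) but squares the nonmodular one (to about 16.7 million), which is the qualitative gap the abstract attributes to modular networks.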