Y-Drop: A Conductance-based Dropout for fully connected layers

Efthymios Georgiou, Georgios Paraskevopoulos, Alexandros Potamianos
{"title":"Y-Drop: A Conductance based Dropout for fully connected layers","authors":"Efthymios Georgiou, Georgios Paraskevopoulos, Alexandros Potamianos","doi":"arxiv-2409.09088","DOIUrl":null,"url":null,"abstract":"In this work, we introduce Y-Drop, a regularization method that biases the\ndropout algorithm towards dropping more important neurons with higher\nprobability. The backbone of our approach is neuron conductance, an\ninterpretable measure of neuron importance that calculates the contribution of\neach neuron towards the end-to-end mapping of the network. We investigate the\nimpact of the uniform dropout selection criterion on performance by assigning\nhigher dropout probability to the more important units. We show that forcing\nthe network to solve the task at hand in the absence of its important units\nyields a strong regularization effect. Further analysis indicates that Y-Drop\nyields solutions where more neurons are important, i.e have high conductance,\nand yields robust networks. In our experiments we show that the regularization\neffect of Y-Drop scales better than vanilla dropout w.r.t. the architecture\nsize and consistently yields superior performance over multiple datasets and\narchitecture combinations, with little tuning.","PeriodicalId":501347,"journal":{"name":"arXiv - CS - Neural and Evolutionary Computing","volume":"190 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Neural and Evolutionary Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.09088","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In this work, we introduce Y-Drop, a regularization method that biases the dropout algorithm towards dropping more important neurons with higher probability. The backbone of our approach is neuron conductance, an interpretable measure of neuron importance that calculates the contribution of each neuron to the end-to-end mapping of the network. We investigate the impact of the uniform dropout selection criterion on performance by assigning higher dropout probability to the more important units. We show that forcing the network to solve the task at hand in the absence of its important units yields a strong regularization effect. Further analysis indicates that Y-Drop yields solutions in which more neurons are important, i.e., have high conductance, and produces robust networks. In our experiments, we show that the regularization effect of Y-Drop scales better than vanilla dropout with respect to architecture size, and consistently yields superior performance over multiple dataset and architecture combinations with little tuning.
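Conductance, as used in the interpretability literature, is typically estimated with an integrated-gradients-style path integral: the change in a neuron's activation from a baseline, multiplied by the average gradient of the output with respect to that neuron along the interpolation path. Below is a minimal PyTorch sketch of such an estimator; the function name neuron_conductance, the zero baseline, the step count, and the batch-mean aggregation are illustrative assumptions, not the paper's exact formulation.

    import torch

    def neuron_conductance(head, hidden, baseline=None, steps=20):
        """Integrated-gradients-style conductance for one hidden layer.

        head   : module mapping hidden activations to a scalar-per-sample
                 output (stand-in for the layers above the one analyzed).
        hidden : (batch, units) activations at the layer of interest.
        Returns a (batch, units) tensor of per-neuron contributions.
        """
        if baseline is None:
            baseline = torch.zeros_like(hidden)   # assumed zero baseline
        total_grad = torch.zeros_like(hidden)
        for k in range(1, steps + 1):
            # Interpolate between the baseline and the actual activations.
            point = baseline + (k / steps) * (hidden - baseline)
            point = point.detach().requires_grad_(True)
            out = head(point).sum()               # sum keeps per-sample grads
            total_grad += torch.autograd.grad(out, point)[0]
        # (activation delta) x (average gradient along the path)
        return (hidden - baseline) * total_grad / steps

    # Per-neuron importance: aggregate contribution magnitudes over the batch.
    head = torch.nn.Linear(128, 1)                # stand-in for upper layers
    hidden = torch.randn(32, 128)
    importance = neuron_conductance(head, hidden).abs().mean(dim=0)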
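Given per-neuron importance scores, the core idea is to assign higher drop probabilities to more important units. The sketch below shows one plausible mapping, where scores are normalized and rescaled so the mean drop probability matches a target rate; the normalization, the mean-matching rescale, the clipping ceiling, and the name ydrop_layer are assumptions for illustration, not the paper's published scheme.

    import torch

    def ydrop_layer(h, importance, base_rate=0.5, max_rate=0.9):
        """Importance-biased dropout on activations h of shape (batch, units).

        Higher-importance neurons receive a higher drop probability;
        inverted-dropout rescaling keeps the expected activation unchanged,
        so nothing special is needed at evaluation time.
        """
        imp = importance - importance.min()
        imp = imp / (imp.max() + 1e-8)            # normalize scores to [0, 1]
        # Rescale so the mean drop probability is ~base_rate, then clip.
        p_drop = (imp * base_rate / (imp.mean() + 1e-8)).clamp(0.0, max_rate)
        keep = torch.bernoulli((1.0 - p_drop).expand_as(h))
        return h * keep / (1.0 - p_drop)

    # During training, high-conductance units are silenced more often:
    h = torch.randn(32, 128)
    importance = torch.rand(128)                  # e.g. conductance scores
    h_regularized = ydrop_layer(h, importance)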