Beyond the Bias Variance Trade-Off: A Mutual Information Trade-Off in Deep Learning

Xinjie Lan, Bin Zhu, C. Boncelet, K. Barner
DOI: 10.1109/mlsp52302.2021.9596544
Published in: 2021 IEEE 31st International Workshop on Machine Learning for Signal Processing (MLSP), 2021-10-25

Abstract

The classical bias-variance trade-off cannot accurately explain how over-parameterized Deep Neural Networks (DNNs) avoid overfitting and achieve good generalization. To address this problem, we instead derive a Mutual Information (MI) trade-off based on the recently proposed MI explanation for generalization. In addition, we propose a probabilistic representation of DNNs for accurately estimating the MI. Compared to the classical bias-variance trade-off, the MI trade-off not only accurately measures the generalization of over-parameterized DNNs but also formulates the relation between DNN architecture and generalization.
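The paper does not reproduce its estimator here, but the core quantity it works with, the mutual information between a (discretized) hidden representation T and the labels Y, can be illustrated with a simple plug-in estimate. The sketch below is not the paper's probabilistic representation; it is the standard binning approach, with the identity I(T; Y) = H(T) + H(Y) - H(T, Y) applied to empirical frequencies. The bin edges and the synthetic data are arbitrary choices for illustration.

```python
import numpy as np

def entropy(labels):
    # Shannon entropy (in nats) of a discrete sample, from empirical frequencies
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log(p))

def mutual_information(t, y):
    # Plug-in estimate I(T; Y) = H(T) + H(Y) - H(T, Y);
    # the joint variable is encoded by pairing each (t, y) sample as a string
    joint = np.array([f"{a}|{b}" for a, b in zip(t, y)])
    return entropy(t) + entropy(y) - entropy(joint)

rng = np.random.default_rng(0)
y = rng.integers(0, 2, size=1000)                  # binary labels
h = y + 0.3 * rng.standard_normal(1000)            # hidden activation correlated with y
t = np.digitize(h, bins=np.linspace(-1.0, 2.0, 8)) # discretize activations into bins
print(mutual_information(t, y))                    # close to H(Y) = log 2 when T is informative
```

Plug-in estimates like this are known to be biased for continuous activations, which is precisely why the paper proposes a probabilistic representation of DNNs for more accurate MI estimation.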