The Supervised Information Bottleneck.

IF 2.1 · CAS Tier 3, Physics & Astronomy · JCR Q2, PHYSICS, MULTIDISCIPLINARY
Entropy Pub Date : 2025-04-22 DOI:10.3390/e27050452
Nir Z Weingarten, Zohar Yakhini, Moshe Butman, Ronit Bustin
Citations: 0

Abstract

The Information Bottleneck (IB) framework offers a theoretically optimal approach to data modeling, although it is often intractable. Recent efforts have optimized supervised deep neural networks (DNNs) using a variational upper bound on the IB objective, leading to enhanced robustness to adversarial attacks. In these studies, supervision assumes a dual role: sometimes as a presumably constant, observed random variable and at other times as its variational approximation. This work proposes an extension to the IB framework that, through the derivation of its variational bound, resolves this duality. Applying the resulting bound as an objective for supervised DNNs yields empirical improvements and provides an information-theoretic motivation for decoder regularization.
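To make the variational approach concrete, the sketch below illustrates the kind of objective the abstract refers to, in the spirit of the Deep Variational Information Bottleneck (Alemi et al., 2017): a cross-entropy distortion term plus a β-weighted KL rate term that upper-bounds I(X; Z) for a Gaussian encoder with a standard-normal prior. This is a generic illustration of a variational IB loss, not the specific extended bound derived in this paper; the function name and toy data are hypothetical.

```python
import numpy as np

def vib_loss(mu, log_var, logits, labels, beta=1e-3):
    """Illustrative variational-IB-style objective (Deep VIB flavor),
    not the paper's exact bound.

    mu, log_var : parameters of the Gaussian encoder q(z|x), shape (B, d)
    logits      : decoder class scores, shape (B, K)
    labels      : integer class labels, shape (B,)
    beta        : trade-off between compression (rate) and prediction
    """
    # Rate term: KL( N(mu, diag(exp(log_var))) || N(0, I) ), per sample.
    # This upper-bounds the mutual information I(X; Z).
    kl = 0.5 * np.sum(np.exp(log_var) + mu**2 - 1.0 - log_var, axis=1)
    # Distortion term: softmax cross-entropy of the decoder predictions.
    shifted = logits - logits.max(axis=1, keepdims=True)
    log_probs = shifted - np.log(np.exp(shifted).sum(axis=1, keepdims=True))
    ce = -log_probs[np.arange(len(labels)), labels]
    # Minimize distortion plus beta times the rate, averaged over the batch.
    return float(np.mean(ce + beta * kl))

# Toy batch: 4 samples, 2-D latent, 3 classes.
rng = np.random.default_rng(0)
mu = rng.normal(size=(4, 2))
log_var = np.zeros((4, 2))   # unit variance: KL then depends on mu only
logits = rng.normal(size=(4, 3))
labels = np.array([0, 2, 1, 1])
loss = vib_loss(mu, log_var, logits, labels)
```

In practice the KL term acts as a stochastic regularizer on the learned representation Z; the paper's contribution is to extend this picture so that the label variable is treated consistently throughout the derivation of the bound.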

Source journal

Entropy (PHYSICS, MULTIDISCIPLINARY)
CiteScore: 4.90
Self-citation rate: 11.10%
Articles published per year: 1580
Average review time: 21.05 days
About the journal: Entropy (ISSN 1099-4300) is an international, interdisciplinary journal of entropy and information studies that publishes reviews, regular research papers, and short notes. Its stated aim is to encourage scientists to publish their theoretical and experimental details as fully as possible, with no restriction on paper length; where computations or experiments are involved, sufficient detail must be provided for the results to be reproduced.