Prevalence of simplex compression in adversarial deep neural networks

IF 9.4 · CAS Zone 1, Multidisciplinary · JCR Q1, MULTIDISCIPLINARY SCIENCES
Yang Cao, Yanbo Chen, Weiwei Liu
{"title":"对抗性深度神经网络中单纯形压缩的流行","authors":"Yang Cao, Yanbo Chen, Weiwei Liu","doi":"10.1073/pnas.2421593122","DOIUrl":null,"url":null,"abstract":"Neural collapse (NC) reveals that the last layer of the network can capture data representations, leading to similar outputs for examples within the same class, while outputs for examples from different classes form a simplex equiangular tight frame (ETF) structure. This phenomenon has garnered significant attention due to its implications on the intrinsic properties of neural networks. Interestingly, we observe a simplex compression phenomenon in NC, where the geometric size of the simplex ETF reduces under adversarial training, with the degree of compression increasing as the perturbation radius grows. We provide empirical evidence supporting the existence of simplex compression across a wide range of models and datasets. Furthermore, we establish a rigorous theoretical framework that explains our experimental observations, offering insights into NC under adversarial conditions.","PeriodicalId":20548,"journal":{"name":"Proceedings of the National Academy of Sciences of the United States of America","volume":"48 1","pages":""},"PeriodicalIF":9.4000,"publicationDate":"2025-04-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Prevalence of simplex compression in adversarial deep neural networks\",\"authors\":\"Yang Cao, Yanbo Chen, Weiwei Liu\",\"doi\":\"10.1073/pnas.2421593122\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Neural collapse (NC) reveals that the last layer of the network can capture data representations, leading to similar outputs for examples within the same class, while outputs for examples from different classes form a simplex equiangular tight frame (ETF) structure. This phenomenon has garnered significant attention due to its implications on the intrinsic properties of neural networks. Interestingly, we observe a simplex compression phenomenon in NC, where the geometric size of the simplex ETF reduces under adversarial training, with the degree of compression increasing as the perturbation radius grows. We provide empirical evidence supporting the existence of simplex compression across a wide range of models and datasets. 
Furthermore, we establish a rigorous theoretical framework that explains our experimental observations, offering insights into NC under adversarial conditions.\",\"PeriodicalId\":20548,\"journal\":{\"name\":\"Proceedings of the National Academy of Sciences of the United States of America\",\"volume\":\"48 1\",\"pages\":\"\"},\"PeriodicalIF\":9.4000,\"publicationDate\":\"2025-04-25\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Proceedings of the National Academy of Sciences of the United States of America\",\"FirstCategoryId\":\"103\",\"ListUrlMain\":\"https://doi.org/10.1073/pnas.2421593122\",\"RegionNum\":1,\"RegionCategory\":\"综合性期刊\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"MULTIDISCIPLINARY SCIENCES\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the National Academy of Sciences of the United States of America","FirstCategoryId":"103","ListUrlMain":"https://doi.org/10.1073/pnas.2421593122","RegionNum":1,"RegionCategory":"综合性期刊","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"MULTIDISCIPLINARY SCIENCES","Score":null,"Total":0}
Citations: 0

Abstract

Neural collapse (NC) reveals that the last layer of the network can capture data representations, leading to similar outputs for examples within the same class, while outputs for examples from different classes form a simplex equiangular tight frame (ETF) structure. This phenomenon has garnered significant attention due to its implications for the intrinsic properties of neural networks. Interestingly, we observe a simplex compression phenomenon in NC, where the geometric size of the simplex ETF shrinks under adversarial training, with the degree of compression increasing as the perturbation radius grows. We provide empirical evidence supporting the existence of simplex compression across a wide range of models and datasets. Furthermore, we establish a rigorous theoretical framework that explains our experimental observations, offering insights into NC under adversarial conditions.
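To make the ETF terminology concrete, below is a minimal NumPy sketch of the standard K-class simplex ETF construction from the neural-collapse literature, plus one plausible notion of the simplex's "geometric size" (the average norm of the centered vertices). The `etf_size` metric is an illustrative assumption, not necessarily the exact measure used in the paper; under simplex compression, such a metric would decrease as the adversarial perturbation radius grows.

```python
import numpy as np

def simplex_etf(K: int, d: int, seed: int = 0) -> np.ndarray:
    """Return a d x K matrix whose columns are the vertices of a
    K-class simplex equiangular tight frame (ETF) embedded in R^d.

    Standard construction: M = sqrt(K/(K-1)) * U @ (I_K - (1/K) * 1 1^T),
    where U is a d x K matrix with orthonormal columns (requires d >= K).
    """
    assert d >= K, "ambient dimension must be at least the number of classes"
    rng = np.random.default_rng(seed)
    # Orthonormalize a random d x K Gaussian matrix to obtain U.
    U, _ = np.linalg.qr(rng.standard_normal((d, K)))
    center = np.eye(K) - np.ones((K, K)) / K
    return np.sqrt(K / (K - 1)) * U @ center

def etf_size(class_means: np.ndarray) -> float:
    """One plausible 'geometric size' of the simplex formed by the
    class means: the average Euclidean norm of the globally centered
    means. (Illustrative metric; the paper's measure may differ.)
    """
    centered = class_means - class_means.mean(axis=1, keepdims=True)
    return float(np.linalg.norm(centered, axis=0).mean())

K, d = 10, 512
M = simplex_etf(K, d)
# ETF property: every pair of vertices has cosine similarity -1/(K-1).
Mn = M / np.linalg.norm(M, axis=0)
G = Mn.T @ Mn
off_diag = G[~np.eye(K, dtype=bool)]
print(np.allclose(off_diag, -1 / (K - 1)))  # True
# Baseline size of the ideal ETF; simplex compression corresponds to
# this quantity shrinking for the learned class means as the
# adversarial perturbation radius increases.
print(etf_size(M))
```

In this framing, the paper's observation amounts to tracking a size statistic like `etf_size` on the last-layer class-mean features of networks adversarially trained at increasing perturbation radii, and finding that it decreases monotonically.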
Source journal
CiteScore: 19.00
Self-citation rate: 0.90%
Annual articles: 3575
Review time: 2.5 months
About the journal: The Proceedings of the National Academy of Sciences (PNAS), a peer-reviewed journal of the National Academy of Sciences (NAS), serves as an authoritative source for high-impact, original research across the biological, physical, and social sciences. With a global scope, the journal welcomes submissions from researchers worldwide, making it an inclusive platform for advancing scientific knowledge.