International Journal of Neural Systems. Pub Date: 2025-04-01. Epub Date: 2025-02-19. DOI: 10.1142/S0129065725500133
Yu Xue, Yan Lin, Ferrante Neri
{"title":"Architecture Knowledge Distillation for Evolutionary Generative Adversarial Network.","authors":"Yu Xue, Yan Lin, Ferrante Neri","doi":"10.1142/S0129065725500133","DOIUrl":null,"url":null,"abstract":"<p><p>Generative Adversarial Networks (GANs) are effective for image generation, but their unstable training limits broader applications. Additionally, neural architecture search (NAS) for GANs with one-shot models often leads to insufficient subnet training, where subnets inherit weights from a supernet without proper optimization, further degrading performance. To address both issues, we propose Architecture Knowledge Distillation for Evolutionary GAN (AKD-EGAN). AKD-EGAN operates in two stages. First, architecture knowledge distillation (AKD) is used during supernet training to efficiently optimize subnetworks and accelerate learning. Second, a multi-objective evolutionary algorithm (MOEA) searches for optimal subnet architectures, ensuring efficiency by considering multiple performance metrics. This approach, combined with a strategy for architecture inheritance, enhances GAN stability and image quality. Experiments show that AKD-EGAN surpasses state-of-the-art methods, achieving a Fréchet Inception Distance (FID) of 7.91 and an Inception Score (IS) of 8.97 on CIFAR-10, along with competitive results on STL-10 (FID: 20.32, IS: 10.06). Code and models will be available at https://github.com/njit-ly/AKD-EGAN.</p>","PeriodicalId":94052,"journal":{"name":"International journal of neural systems","volume":" ","pages":"2550013"},"PeriodicalIF":0.0000,"publicationDate":"2025-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"International journal of neural systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1142/S0129065725500133","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/2/19 0:00:00","PubModel":"Epub","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

摘要

生成对抗网络(GAN)对图像生成非常有效,但其不稳定的训练限制了其更广泛的应用。此外,使用单次模型对 GANs 进行神经架构搜索(NAS)往往会导致子网训练不足,子网会继承超级网的权重,而没有进行适当的优化,从而进一步降低性能。为了解决这两个问题,我们提出了进化式 GAN 的架构知识提炼(AKD-EGAN)。AKD-EGAN 分两个阶段运行。首先,在超级网络训练过程中使用架构知识蒸馏(AKD)来有效优化子网络并加速学习。其次,多目标进化算法(MOEA)搜索最佳子网架构,通过考虑多个性能指标确保效率。这种方法与架构继承策略相结合,提高了 GAN 的稳定性和图像质量。实验表明,AKD-EGAN 超越了最先进的方法,在 CIFAR-10 上达到了 7.91 的弗雷谢特起始距离(FID)和 8.97 的起始分数(IS),在 STL-10 上也取得了具有竞争力的结果(FID:20.32,IS:10.06)。代码和模型可在 https://github.com/njit-ly/AKD-EGAN 上查阅。
本文章由计算机程序翻译,如有差异,请以英文原文为准。
Architecture Knowledge Distillation for Evolutionary Generative Adversarial Network.

Generative Adversarial Networks (GANs) are effective for image generation, but their unstable training limits broader applications. Additionally, neural architecture search (NAS) for GANs with one-shot models often leads to insufficient subnet training, where subnets inherit weights from a supernet without proper optimization, further degrading performance. To address both issues, we propose Architecture Knowledge Distillation for Evolutionary GAN (AKD-EGAN). AKD-EGAN operates in two stages. First, architecture knowledge distillation (AKD) is used during supernet training to efficiently optimize subnetworks and accelerate learning. Second, a multi-objective evolutionary algorithm (MOEA) searches for optimal subnet architectures, ensuring efficiency by considering multiple performance metrics. This approach, combined with a strategy for architecture inheritance, enhances GAN stability and image quality. Experiments show that AKD-EGAN surpasses state-of-the-art methods, achieving a Fréchet Inception Distance (FID) of 7.91 and an Inception Score (IS) of 8.97 on CIFAR-10, along with competitive results on STL-10 (FID: 20.32, IS: 10.06). Code and models will be available at https://github.com/njit-ly/AKD-EGAN.
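The abstract describes a two-stage pipeline: architecture knowledge distillation applied while training the supernet, followed by a multi-objective evolutionary search over candidate subnet architectures. Below is a minimal, hedged sketch of those two ideas in PyTorch. The names (distillation_loss, pareto_front), the MSE-based distillation target, and the (FID, IS) objective pairing are illustrative assumptions only; the authors' actual losses and search operators are defined in the paper and the linked repository.

# Minimal sketch of the two-stage idea in the abstract; all names and loss forms
# here are assumptions for illustration, not the authors' implementation.
import torch
import torch.nn.functional as F


def distillation_loss(student_out: torch.Tensor,
                      teacher_out: torch.Tensor,
                      weight: float = 1.0) -> torch.Tensor:
    # Stage 1 (assumed form): pull a sampled subnet ("student") towards the
    # supernet ("teacher") so that weight-sharing subnets get properly trained.
    return weight * F.mse_loss(student_out, teacher_out.detach())


def pareto_front(candidates):
    # Stage 2 (assumed form): keep architectures that are non-dominated when
    # jointly minimizing FID and maximizing IS (IS is negated so both objectives
    # are minimized).
    objs = [(fid, -is_score) for fid, is_score in candidates]
    front = []
    for i, a in enumerate(objs):
        dominated = any(
            all(b[k] <= a[k] for k in range(2)) and any(b[k] < a[k] for k in range(2))
            for j, b in enumerate(objs) if j != i
        )
        if not dominated:
            front.append(i)
    return front


if __name__ == "__main__":
    # Toy distillation step on random "images".
    student = torch.randn(4, 3, 32, 32, requires_grad=True)
    teacher = torch.randn(4, 3, 32, 32)
    distillation_loss(student, teacher).backward()

    # Toy multi-objective selection over hypothetical (FID, IS) pairs per subnet.
    scores = [(9.5, 8.1), (7.9, 8.9), (8.3, 9.0), (12.0, 7.5)]
    print("Non-dominated candidate indices:", pareto_front(scores))

In this toy run, only the candidates that trade off FID against IS without being beaten on both (indices 1 and 2) survive, which mirrors how a multi-objective evolutionary algorithm retains a Pareto set of architectures rather than a single winner.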
