Wavelet-Based Dual-Task Network

IF 10.2 | Region 1 (Computer Science) | Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Fuzhi Wu, Jiasong Wu, Chen Zhang, Youyong Kong, Chunfeng Yang, Guanyu Yang, Huazhong Shu, Guy Carrault, Lotfi Senhadji
{"title":"基于小波的双任务网络","authors":"Fuzhi Wu, Jiasong Wu, Chen Zhang, Youyong Kong, Chunfeng Yang, Guanyu Yang, Huazhong Shu, Guy Carrault, Lotfi Senhadji","doi":"10.1109/TNNLS.2024.3486330","DOIUrl":null,"url":null,"abstract":"<p><p>In image processing, wavelet transform (WT) offers multiscale image decomposition, generating a blend of low-resolution approximation images and high-resolution detail components. Drawing parallels to this concept, we view feature maps in convolutional neural networks (CNNs) as a similar mix, but uniquely within the channel domain. Inspired by multitask learning (MTL) principles, we propose a wavelet-based dual-task (WDT) framework. This novel framework employs WT in the channel domain to split a single task into two parallel tasks, thereby reforming traditional single-task CNNs into dynamic dual-task networks. Our WDT framework integrates seamlessly with various popular network architectures, enhancing their versatility and efficiency. It offers a more rational approach to resource allocation in CNNs, balancing between low-frequency and high-frequency information. Rigorous experiments on Cifar10, ImageNet, HMDB51, and UCF101 validate our approach's effectiveness. Results reveal significant improvements in the performance of traditional CNNs on classification tasks, and notably, these enhancements are achieved with fewer parameters and computations. In summary, our work presents a pioneering step toward redefining the performance and efficiency of CNN-based tasks through WT.</p>","PeriodicalId":13303,"journal":{"name":"IEEE transactions on neural networks and learning systems","volume":"PP ","pages":""},"PeriodicalIF":10.2000,"publicationDate":"2024-11-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Wavelet-Based Dual-Task Network.\",\"authors\":\"Fuzhi Wu, Jiasong Wu, Chen Zhang, Youyong Kong, Chunfeng Yang, Guanyu Yang, Huazhong Shu, Guy Carrault, Lotfi Senhadji\",\"doi\":\"10.1109/TNNLS.2024.3486330\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p><p>In image processing, wavelet transform (WT) offers multiscale image decomposition, generating a blend of low-resolution approximation images and high-resolution detail components. Drawing parallels to this concept, we view feature maps in convolutional neural networks (CNNs) as a similar mix, but uniquely within the channel domain. Inspired by multitask learning (MTL) principles, we propose a wavelet-based dual-task (WDT) framework. This novel framework employs WT in the channel domain to split a single task into two parallel tasks, thereby reforming traditional single-task CNNs into dynamic dual-task networks. Our WDT framework integrates seamlessly with various popular network architectures, enhancing their versatility and efficiency. It offers a more rational approach to resource allocation in CNNs, balancing between low-frequency and high-frequency information. Rigorous experiments on Cifar10, ImageNet, HMDB51, and UCF101 validate our approach's effectiveness. Results reveal significant improvements in the performance of traditional CNNs on classification tasks, and notably, these enhancements are achieved with fewer parameters and computations. 
In summary, our work presents a pioneering step toward redefining the performance and efficiency of CNN-based tasks through WT.</p>\",\"PeriodicalId\":13303,\"journal\":{\"name\":\"IEEE transactions on neural networks and learning systems\",\"volume\":\"PP \",\"pages\":\"\"},\"PeriodicalIF\":10.2000,\"publicationDate\":\"2024-11-06\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE transactions on neural networks and learning systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/10.1109/TNNLS.2024.3486330\",\"RegionNum\":1,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE transactions on neural networks and learning systems","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.1109/TNNLS.2024.3486330","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

In image processing, wavelet transform (WT) offers multiscale image decomposition, generating a blend of low-resolution approximation images and high-resolution detail components. Drawing parallels to this concept, we view feature maps in convolutional neural networks (CNNs) as a similar mix, but uniquely within the channel domain. Inspired by multitask learning (MTL) principles, we propose a wavelet-based dual-task (WDT) framework. This novel framework employs WT in the channel domain to split a single task into two parallel tasks, thereby reforming traditional single-task CNNs into dynamic dual-task networks. Our WDT framework integrates seamlessly with various popular network architectures, enhancing their versatility and efficiency. It offers a more rational approach to resource allocation in CNNs, balancing between low-frequency and high-frequency information. Rigorous experiments on Cifar10, ImageNet, HMDB51, and UCF101 validate our approach's effectiveness. Results reveal significant improvements in the performance of traditional CNNs on classification tasks, and notably, these enhancements are achieved with fewer parameters and computations. In summary, our work presents a pioneering step toward redefining the performance and efficiency of CNN-based tasks through WT.
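The abstract gives no implementation details, so the following is only a minimal illustrative sketch of what "WT in the channel domain" could look like: a single-level Haar transform applied to adjacent channel pairs of a CNN feature map, yielding a low-frequency half and a high-frequency half that could feed two parallel branches. The function name channel_haar_split, the use of PyTorch, and the tensor shapes are assumptions for illustration, not code from the paper.

```python
# Sketch only: split a feature map into low- and high-frequency channel halves
# with a single-level Haar wavelet transform along the channel dimension.
import torch


def channel_haar_split(x: torch.Tensor):
    """Split (N, C, H, W) into low- and high-frequency halves of shape (N, C/2, H, W).

    Adjacent channel pairs (c0, c1) are combined as
        low  = (c0 + c1) / sqrt(2)   # approximation coefficients
        high = (c0 - c1) / sqrt(2)   # detail coefficients
    The channel count C is assumed to be even.
    """
    n, c, h, w = x.shape
    assert c % 2 == 0, "channel count must be even for a single-level Haar split"
    pairs = x.view(n, c // 2, 2, h, w)           # group adjacent channels into pairs
    low = (pairs[:, :, 0] + pairs[:, :, 1]) / (2 ** 0.5)
    high = (pairs[:, :, 0] - pairs[:, :, 1]) / (2 ** 0.5)
    return low, high


if __name__ == "__main__":
    feat = torch.randn(8, 64, 32, 32)            # hypothetical intermediate feature map
    low, high = channel_haar_split(feat)
    print(low.shape, high.shape)                 # both torch.Size([8, 32, 32, 32])
    # In a dual-task arrangement, `low` and `high` could be routed to two parallel
    # branches, echoing the single-task-into-two-tasks split described in the abstract.
```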

Source journal
IEEE Transactions on Neural Networks and Learning Systems
Categories: COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE; COMPUTER SCIENCE, HARDWARE & ARCHITECTURE
CiteScore: 23.80
Self-citation rate: 9.60%
Articles published: 2102
Review time: 3-8 weeks
Journal description: The focus of IEEE Transactions on Neural Networks and Learning Systems is to present scholarly articles discussing the theory, design, and applications of neural networks as well as other learning systems. The journal primarily highlights technical and scientific research in this domain.