IEEE Signal Processing Magazine: Latest Publications

Rapture of the Deep: Highs and lows of sparsity in a world of depths
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2025.3611564
Rémi Gribonval; Elisa Riccietti; Quoc-Tung Le; Léon Zheng
Vol. 43, no. 2, pp. 10-23
Abstract: Promoting sparsity in deep networks is a natural way to control their complexity, and it is a timely endeavor since practical neural model sizes have grown to unprecedented levels. The lessons from sparsity in linear inverse problems also bear the promise of many other benefits beyond such computational aspects, from statistical significance to explainability. Can these promises be fulfilled? Can we safely leverage the know-how of sparsity-promoting regularizers for inverse problems to harness sparsity in deeper contexts, linear or not? This article surveys the curses and blessings of deep sparsity. After a reminder of the main lessons from inverse problems, we tour a number of results that challenge their immediate deep extensions, from both a mathematical and a computational perspective. In particular, we highlight that $\ell^1$ regularization does not always lead to sparsity, and that optimization with a prescribed set of allowed nonzero coefficients can be NP-hard. We emphasize the role of rescaling invariances in these phenomena and the need to favor structured sparsity to keep sparse network training problems under control, ensure their stability, and actually enable efficient network implementations on GPUs. We finally outline the promises and challenges of a flexible family of Kronecker sparsity structures, which extend the classical butterfly structure, appear in many classical scientific computing applications, and have also recently emerged in deep learning.
Citations: 0
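The abstract above builds on ℓ¹ regularization as the classical sparsity-promoting tool for linear inverse problems. As a minimal sketch (illustrative only, not the article's method), the following shows the soft-thresholding operator and ISTA, the standard iterative algorithm for the ℓ¹-regularized least-squares (lasso) problem; all names and parameter values here are assumptions chosen for the demo.

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal map of lam*||.||_1: shrinks entries toward zero, zeroing small ones."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista(A, y, lam, n_iter=2000):
    """Iterative soft-thresholding for min_x 0.5*||Ax - y||^2 + lam*||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the smooth gradient
    x = np.zeros(A.shape[1])
    for _ in range(n_iter):
        x = soft_threshold(x - (A.T @ (A @ x - y)) / L, lam / L)
    return x

# Toy noiseless recovery of a 3-sparse vector from 30 random measurements.
rng = np.random.default_rng(0)
A = rng.standard_normal((30, 60))
x_true = np.zeros(60)
x_true[[3, 17, 42]] = [2.0, -1.5, 1.0]
y = A @ x_true
x_hat = ista(A, y, lam=0.1)
print("recovered support:", np.flatnonzero(np.abs(x_hat) > 0.05))
```

The ℓ¹ penalty drives most coefficients exactly to zero here, which is precisely the behavior the article warns need not carry over to deep (multilayer) parameterizations.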
SPS Advance Your Career
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3676841
Vol. 43, no. 2, p. C3 (open access)
Citations: 0
SPS Podcast
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3676840
Vol. 43, no. 2, p. 123 (open access)
Citations: 0
Probabilistic Graphical Models Meet Deep Learning for Semantic Segmentation: Mathematical connections and recent developments
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2025.3648958
Martina Pastorino; Gabriele Moser; Sebastiano B. Serpico; Josiane Zerubia
Vol. 43, no. 2, pp. 51-63
Abstract: Semantic segmentation, also known as spatially dense image classification, plays a crucial role in image analysis, bridging the fields of image processing and machine learning. It has wide applications, ranging from land cover mapping in Earth observation to medical diagnostics using biomedical images and fault detection in industrial imagery. This article focuses on the mathematical connections between two pivotal families of methodological approaches, probabilistic graphical models (PGMs) and deep learning (DL), and explores the potential of their integration for semantic segmentation tasks. After providing a comprehensive overview of state-of-the-art techniques from both families, the article highlights recent developments that combine these approaches, either through theoretical equivalence or direct integration. Examples of results are provided for renowned benchmark datasets in computer vision and remote sensing, and the article concludes with a discussion of promising future research directions.
Citations: 0
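A canonical PGM for segmentation, mentioned in surveys of this kind, is a Markov random field with a Potts smoothness prior. As a minimal illustrative sketch (an assumption for exposition, not the article's model), the following computes the energy of a labeling as per-pixel unary costs plus a penalty on disagreeing 4-connected neighbors.

```python
import numpy as np

def potts_energy(labels, unary, beta):
    """Energy of a labeling under a Potts MRF.
    labels: (H, W) int array; unary: (H, W, K) per-pixel label costs;
    beta: weight on each pair of disagreeing 4-connected neighbors."""
    h, w = labels.shape
    e = unary[np.arange(h)[:, None], np.arange(w)[None, :], labels].sum()
    e += beta * np.count_nonzero(labels[1:, :] != labels[:-1, :])   # vertical edges
    e += beta * np.count_nonzero(labels[:, 1:] != labels[:, :-1])   # horizontal edges
    return e

# Tiny 2x2 example with 2 classes: label 0 is cheap everywhere.
unary = np.zeros((2, 2, 2))
unary[..., 1] = 1.0
smooth = np.zeros((2, 2), dtype=int)      # homogeneous labeling
noisy = smooth.copy()
noisy[0, 0] = 1                           # one flipped pixel
print(potts_energy(smooth, unary, beta=0.5))   # 0.0
print(potts_energy(noisy, unary, beta=0.5))    # 1.0 unary + 2 edges * 0.5 = 2.0
```

Minimizing such an energy (e.g., by graph cuts or mean-field inference) is the classical counterpart of the DL-based and hybrid methods the article reviews.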
The Nasreddin Hodja Principle and the Mathematics of Deep Learning [From the Editor]
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3676148
Tülay Adali
Vol. 43, no. 2, pp. 3-4 (open access)
Citations: 0
Continual Learning Through the Lens of Adaptive Filtering: A mathematical tutorial
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3657516
Liangzu Peng; René Vidal
Vol. 43, no. 2, pp. 24-36
Abstract: Continual learning refers to the problem of learning multiple tasks presented sequentially to the learner without forgetting previously learned tasks. Recently, many deep learning-based approaches have been proposed for continual learning; however, the mathematical foundations behind existing continual learning methods remain underdeveloped. On the other hand, adaptive filtering is a classic subject in signal processing with a rich history of mathematically principled methods. However, its role in understanding the foundations of continual learning has been underappreciated. In this tutorial, we review the basic principles behind both continual learning and adaptive filtering and present a comparative analysis that highlights multiple connections between them. These connections allow us to enhance the mathematical foundations of continual learning based on existing results for adaptive filtering, extend adaptive filtering insights using existing continual learning methods, and discuss a few research directions for continual learning suggested by the historical developments in adaptive filtering.
Citations: 0
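The adaptive-filtering side of the connection drawn above is epitomized by the least-mean-squares (LMS) algorithm, which updates its weights online as each sample arrives, the same sequential regime continual learning operates in. A minimal sketch (illustrative, not taken from the tutorial; system order and step size are assumptions):

```python
import numpy as np

def lms(x, d, n_taps=4, mu=0.05):
    """LMS adaptive FIR filter: adapt weights w so w . [x[n], ..., x[n-taps+1]]
    tracks the desired signal d[n], via a stochastic-gradient update per sample."""
    w = np.zeros(n_taps)
    for n in range(n_taps - 1, len(x)):
        u = x[n - n_taps + 1:n + 1][::-1]   # most recent sample first
        e = d[n] - w @ u                    # instantaneous error
        w += mu * e * u                     # gradient step on the squared error
    return w

# Identify an unknown 4-tap FIR system from its input/output streams.
rng = np.random.default_rng(1)
x = rng.standard_normal(5000)
w_true = np.array([0.8, -0.4, 0.2, 0.1])
d = np.convolve(x, w_true)[:len(x)]         # desired output of the unknown system
w_hat = lms(x, d)
print(np.round(w_hat, 3))                   # close to w_true
```

The per-sample update with no revisiting of past data is exactly the streaming constraint that makes the comparison with continual learning natural.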
IEEE Feedback
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3677929
Vol. 43, no. 2, p. 4 (open access)
Citations: 0
ICIP
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2026.3678843
Vol. 43, no. 2, p. C2 (open access)
Citations: 0
Flow-Based Generative Models as Iterative Algorithms in Probability Space: An intuitive mathematical framework [Special Issue on the Mathematics of Deep Learning]
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-13 · DOI: 10.1109/MSP.2025.3609527
Yao Xie; Xiuyuan Cheng
Vol. 43, no. 2, pp. 37-50
Abstract: Flow-based generative models have emerged as a powerful class of deep generative models, offering exact likelihood estimation, invertible sample transformations, and reliable and efficient sampling, making them particularly well suited for applications in signal processing, anomaly detection, and structured data synthesis. Unlike diffusion models, which rely on stochastic differential equations (SDEs) for progressive denoising, flow-based models define deterministic transformations governed by ordinary differential equations (ODEs), allowing for faster inference and interpretable probabilistic modeling. This tutorial presents a rigorous mathematical framework for flow-based generative models, positioning them as iterative algorithms in probability space and exploring their connections to optimal transport and Wasserstein gradient flows. We discuss key algorithmic insights, including continuous normalizing flows (CNFs), flow matching (FM), and distributionally robust optimization (DRO), which enable efficient, high-dimensional generative modeling with theoretical guarantees. We also examine the role of flow-based models in progressive training schemes, demonstrating their convergence properties and generative guarantees under structured optimization frameworks. By bridging mathematical theory with practical implementation, this tutorial aims to provide researchers and practitioners with a comprehensive foundation in flow-based generative modeling and its applications in signal processing and beyond.
Citations: 0
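The two properties the abstract leads with, exact likelihood and invertible sampling, can be seen on the simplest possible flow. As an illustrative sketch (a toy assumption, not the tutorial's framework), take a one-dimensional affine flow x = f(z) = a·z + b applied to a standard normal base: sampling pushes noise through f, and the likelihood follows from the change-of-variables formula log p_x(x) = log p_z(f⁻¹(x)) − log|a|.

```python
import numpy as np

a, b = 2.0, 1.0                                     # assumed flow parameters

def sample(n, rng):
    """Exact sampling: push base noise z ~ N(0,1) through the flow f."""
    return a * rng.standard_normal(n) + b

def log_likelihood(x):
    """Exact likelihood via the change of variables through the inverse flow."""
    z = (x - b) / a                                 # invert the flow
    log_pz = -0.5 * (z**2 + np.log(2 * np.pi))      # standard normal log-density
    return log_pz - np.log(abs(a))                  # log|det| correction

# Sanity check: the result matches the N(b, a^2) log-density computed directly.
x = np.array([0.0, 1.0, 3.0])
direct = -0.5 * ((x - b)**2 / a**2 + np.log(2 * np.pi * a**2))
print(np.allclose(log_likelihood(x), direct))       # True
```

CNFs replace this closed-form map with the solution of a learned ODE, and the log|det| term with an integral of the divergence of the velocity field, but the accounting is the same.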
Random Matrix Theory for Deep Learning: Beyond Eigenvalues of Linear Models [Special Issue on the Mathematics of Deep Learning]
IF 9.6 · Q1 (Engineering & Technology)
IEEE Signal Processing Magazine · Pub Date: 2026-03-01 · Epub Date: 2026-04-14 · DOI: 10.1109/MSP.2025.3618012
Zhenyu Liao; Michael W. Mahoney
Vol. 43, no. 2, pp. 93-106
Abstract: Modern machine learning (ML) and deep neural networks (DNNs) often operate on high-dimensional data and rely on overparameterized models, where classical low-dimensional intuitions break down. In particular, the proportional regime, where the data dimension, sample size, and number of model parameters are all large and comparable, gives rise to novel and sometimes counterintuitive behaviors. This article extends traditional random matrix theory (RMT) beyond eigenvalue-based analysis of linear models to address the challenges posed by nonlinear ML models such as DNNs in this regime. We introduce the concept of high-dimensional equivalent, which unifies and generalizes both deterministic equivalent and linear equivalent, to systematically address three technical challenges: high dimensionality, nonlinearity, and the need to analyze generic eigenspectral functionals. Leveraging this framework, we provide precise characterizations of the training and generalization performance of linear models, nonlinear shallow networks, and deep networks. Our results capture rich phenomena, including scaling laws, double descent, and nonlinear learning dynamics, offering a unified perspective on the theoretical understanding of deep learning in high dimensions.
Citations: 0
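The proportional regime the abstract refers to is easy to observe numerically: when the dimension p and sample size n grow together with ratio c = p/n, the eigenvalues of a sample covariance matrix do not concentrate at 1 but spread over the Marchenko-Pastur bulk [(1 − √c)², (1 + √c)²]. A minimal sketch (illustrative, not from the article; sizes are assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 4000, 1000                        # samples, dimension; ratio c = p/n = 0.25
X = rng.standard_normal((n, p))          # i.i.d. data with identity covariance
eigs = np.linalg.eigvalsh(X.T @ X / n)   # spectrum of the sample covariance

c = p / n
lo, hi = (1 - np.sqrt(c))**2, (1 + np.sqrt(c))**2
print(eigs.min(), eigs.max())            # close to the bulk edges lo = 0.25, hi = 2.25
```

Even though the true covariance is the identity, the smallest and largest sample eigenvalues sit near 0.25 and 2.25, the kind of high-dimensional effect the article's equivalents are built to track through nonlinear layers.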