Ground-Truth Free Meta-Learning for Deep Compressive Sampling

Xinran Qin, Yuhui Quan, T. Pang, Hui Ji
Published in: 2023 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), June 2023.
DOI: 10.1109/CVPR52729.2023.00959

Abstract

Compressive sampling (CS) is an efficient technique for imaging. This paper proposes a ground-truth (GT) free meta-learning method for CS, which leverages both external and internal deep learning for unsupervised high-quality image reconstruction. The proposed method first trains a deep neural network (NN) via external meta-learning using only CS measurements, and then efficiently adapts the trained model to a test sample to exploit sample-specific internal characteristics for a performance gain. The meta-learning and model adaptation are built on an improved Stein's unbiased risk estimator (iSURE) that provides efficient computation and effective guidance for accurate prediction in the range space of the adjoint of the measurement matrix. To improve learning and adaptation on the null space of the measurement matrix, a modified model-agnostic meta-learning scheme and a null-space consistency loss are proposed. In addition, a bias tuning scheme for unrolling NNs is introduced to further accelerate model adaptation. Experimental results demonstrate that the proposed GT-free method performs well and can even compete with supervised methods.
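Two ideas from the abstract can be illustrated with a small linear-algebra sketch: (i) the split of an image into a component in the range space of the adjoint of the measurement matrix (constrained by the measurements) and a component in its null space (invisible to the measurements, which is why a separate null-space loss is needed), and (ii) the standard Monte-Carlo divergence estimate that makes SURE-type losses computable for black-box reconstruction maps. This is a generic sketch of those two building blocks, not the paper's iSURE or its training code; the matrix sizes and the projector-based reconstruction map are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Underdetermined measurement model y = A x (m < n), as in compressive sampling.
m, n = 20, 64
A = rng.standard_normal((m, n)) / np.sqrt(m)

# Orthogonal projectors: P_range maps onto range(A^T), P_null onto null(A).
# The measurements constrain only the range component; the null-space
# component is exactly what a GT-free loss on y alone cannot supervise.
P_range = np.linalg.pinv(A) @ A
P_null = np.eye(n) - P_range

x = rng.standard_normal(n)
x_null = P_null @ x
print(np.linalg.norm(A @ x_null))  # ~0: the null-space part is invisible to A

# SURE-type losses require the divergence of the reconstruction map. For a
# black-box map f (e.g. an NN), a standard Monte-Carlo estimate is
#   div f(y) ≈ E_b[ b^T (f(y + eps*b) - f(y)) / eps ],  b ~ N(0, I).
def mc_divergence(f, y, n_probes=2000, eps=1e-3):
    total = 0.0
    for _ in range(n_probes):
        b = rng.standard_normal(y.shape)
        total += b @ (f(y + eps * b) - f(y)) / eps
    return total / n_probes

# Here f is linear (projection onto range(A^T)), so the exact divergence is
# trace(P_range) = rank(A) = m, which the estimate should approach.
f = lambda v: P_range @ v
est = mc_divergence(f, x)
print(est)  # close to trace(P_range) = m = 20
```

For a nonlinear NN the same probe-based estimate applies unchanged, which is what makes SURE-style training practical without ground truth: the loss needs only the measurements, the noise level, and forward evaluations of the network.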