High‐dimensional differential networks with sparsity and reduced‐rank

Pub Date: 2024-05-13 | DOI: 10.1002/sta4.690
Yao Wang, Cheng Wang, Binyan Jiang
{"title":"High‐dimensional differential networks with sparsity and reduced‐rank","authors":"Yao Wang, Cheng Wang, Binyan Jiang","doi":"10.1002/sta4.690","DOIUrl":null,"url":null,"abstract":"Differential network analysis plays a crucial role in capturing nuanced changes in conditional correlations between two samples. Under the high‐dimensional setting, the differential network, that is, the difference between the two precision matrices are usually stylized with sparse signals and some low‐rank latent factors. Recognizing the distinctions inherent in the precision matrices of such networks, we introduce a novel approach, termed ‘SR‐Network’ for the estimation of sparse and reduced‐rank differential networks. This method directly assesses the differential network by formulating a convex empirical loss function with ‐norm and nuclear norm penalties. The study establishes finite‐sample error bounds for parameter estimation and highlights the superior performance of the proposed method through extensive simulations and real data studies. This research significantly contributes to the advancement of methodologies for accurate analysis of differential networks, particularly in the context of structures characterized by sparsity and low‐rank features.","PeriodicalId":0,"journal":{"name":"","volume":null,"pages":null},"PeriodicalIF":0.0,"publicationDate":"2024-05-13","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"","FirstCategoryId":"100","ListUrlMain":"https://doi.org/10.1002/sta4.690","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Differential network analysis plays a crucial role in capturing nuanced changes in conditional correlations between two samples. In the high-dimensional setting, the differential network, that is, the difference between the two precision matrices, is typically characterized by sparse signals together with a few low-rank latent factors. Recognizing these structural features of the precision matrices, we introduce a novel approach, termed 'SR-Network', for the estimation of sparse and reduced-rank differential networks. The method estimates the differential network directly by minimizing a convex empirical loss function with ℓ1-norm and nuclear norm penalties. The study establishes finite-sample error bounds for parameter estimation and highlights the superior performance of the proposed method through extensive simulations and real data studies. This research significantly contributes to the advancement of methodologies for the accurate analysis of differential networks, particularly for structures characterized by sparsity and low-rank features.
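The abstract does not spell out the exact objective, but the ingredients it names (a convex loss on the differential network, an ℓ1-norm penalty on a sparse component and a nuclear norm penalty on a low-rank component) can be illustrated with a short proximal-gradient sketch. The quadratic D-trace-type loss used below, the decomposition Delta = S + L, and the names sr_network_sketch, soft_threshold, svt, lam1 and lam2 are assumptions made for illustration only; they are not taken from the paper.

```python
import numpy as np

def soft_threshold(M, tau):
    """Entrywise soft-thresholding: proximal operator of the l1-norm penalty."""
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def svt(M, tau):
    """Singular-value thresholding: proximal operator of the nuclear norm penalty."""
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def sr_network_sketch(Sigma1, Sigma2, lam1, lam2, n_iter=500):
    """Illustrative proximal-gradient solver for a sparse-plus-low-rank
    differential network Delta = S + L, where Delta approximates
    Omega2 - Omega1, the difference of the two precision matrices.

    Assumed (stand-in) loss, not the paper's exact empirical loss:
        0.5 * tr(Sigma1 @ Delta @ Sigma2 @ Delta) - tr(Delta @ (Sigma1 - Sigma2))
    penalized by lam1 * ||S||_1 + lam2 * ||L||_*.
    """
    p = Sigma1.shape[0]
    S = np.zeros((p, p))   # sparse component
    L = np.zeros((p, p))   # low-rank component
    # Step size bounded by 1 / (Lipschitz constant of the joint gradient in (S, L)).
    step = 1.0 / (2.0 * np.linalg.norm(Sigma1, 2) * np.linalg.norm(Sigma2, 2))
    for _ in range(n_iter):
        Delta = S + L
        grad = 0.5 * (Sigma1 @ Delta @ Sigma2 + Sigma2 @ Delta @ Sigma1) \
               - (Sigma1 - Sigma2)
        # The smooth loss is shared, so each block takes the same gradient step
        # followed by its own proximal map.
        S = soft_threshold(S - step * grad, step * lam1)
        L = svt(L - step * grad, step * lam2)
    return S, L
```

In practice one would pass the two sample covariance matrices and choose the penalty levels lam1 and lam2 by cross-validation or an information criterion; the soft-thresholding step enforces sparsity of S, while the singular-value thresholding step shrinks L toward low rank.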