Robust linear dimensionality reduction for hypothesis testing with application to sensor selection

D. Bajović, B. Sinopoli, Joao Xavier
{"title":"Robust linear dimensionality reduction for hypothesis testing with application to sensor selection","authors":"D. Bajović, B. Sinopoli, Joao Xavier","doi":"10.1109/ALLERTON.2009.5394788","DOIUrl":null,"url":null,"abstract":"This paper addresses robust linear dimensionality reduction (RLDR) for binary Gaussian hypothesis testing. The goal is to find a linear map from the high dimensional space where the data vector lives to a low dimensional space where the hypothesis test is carried out. The linear map is designed to maximize the detector performance. This translates into maximizing the Kullback-Leibler (KL) distance between the two projected distributions. In practice, the distribution parameters are estimated from training data, thus subject to uncertainty. This is modeled by allowing the distribution parameters to drift within some confidence regions. We address the case where only the mean values of the Gaussian distributions, m0 and m1, are uncertain with confidence ellipsoids defined by the corresponding covariance matrices, S0 and S1. Under this setup, we find the linear map that maximizes the KL distance for the worst case drift of the mean values. We solve the problem globally for the case of linear mapping to one dimension, reducing it to a grid search over a finite interval. Our solution shows superior performance compared to robust linear discriminant analysis techniques recently proposed in the literature. In addition, we use our RLDR solution as a building block to derive a sensor selection algorithm for robust event detection, in the context of sensor networks. Our sensor selection algorithm shows quasi-optimal performance: worst-case KL distance for suboptimal sensor selection is at most 15% smaller than worst-case KL distance for the optimal sensor selection obtained by exhaustive search.","PeriodicalId":440015,"journal":{"name":"2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","volume":"192 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-09-30","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 47th Annual Allerton Conference on Communication, Control, and Computing (Allerton)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ALLERTON.2009.5394788","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

This paper addresses robust linear dimensionality reduction (RLDR) for binary Gaussian hypothesis testing. The goal is to find a linear map from the high-dimensional space where the data vector lives to a low-dimensional space where the hypothesis test is carried out. The linear map is designed to maximize the detector performance, which translates into maximizing the Kullback-Leibler (KL) distance between the two projected distributions. In practice, the distribution parameters are estimated from training data and are thus subject to uncertainty. This is modeled by allowing the distribution parameters to drift within confidence regions. We address the case where only the mean values of the Gaussian distributions, m0 and m1, are uncertain, with confidence ellipsoids defined by the corresponding covariance matrices, S0 and S1. Under this setup, we find the linear map that maximizes the KL distance under the worst-case drift of the mean values. We solve the problem globally for the case of a linear map to one dimension, reducing it to a grid search over a finite interval. Our solution shows superior performance compared to robust linear discriminant analysis techniques recently proposed in the literature. In addition, we use our RLDR solution as a building block to derive a sensor selection algorithm for robust event detection in the context of sensor networks. Our sensor selection algorithm shows quasi-optimal performance: the worst-case KL distance for the suboptimal sensor selection is at most 15% smaller than that for the optimal selection obtained by exhaustive search.
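The worst-case objective described in the abstract is straightforward to reproduce numerically. The sketch below is a minimal illustration, not the paper's code: it assumes confidence ellipsoids of the form {m : (m - m_i)^T S_i^{-1} (m - m_i) <= eps_i^2} around the nominal means and uses KL(p0 || p1) for the projected one-dimensional test; the names eps0, eps1, projected_kl, and worst_case_projected_kl are all illustrative.

```python
# Minimal numerical sketch of the worst-case projected KL objective.
# Assumption (not stated verbatim in the abstract): the mean of hypothesis
# H_i may drift inside {m : (m - m_i)^T S_i^{-1} (m - m_i) <= eps_i^2}.
import numpy as np

def projected_kl(a, m0, m1, S0, S1):
    """KL distance KL(p0 || p1) between the 1-D Gaussians obtained by
    projecting N(m0, S0) and N(m1, S1) onto the direction a."""
    v0, v1 = a @ S0 @ a, a @ S1 @ a          # projected variances
    gap2 = (a @ (m0 - m1)) ** 2              # squared projected mean gap
    return 0.5 * (np.log(v1 / v0) + (v0 + gap2) / v1 - 1.0)

def worst_case_projected_kl(a, m0, m1, S0, S1, eps0, eps1):
    """Worst-case KL over mean drift. Projecting the i-th ellipsoid onto a
    yields the interval a^T m_i +/- eps_i * sqrt(a^T S_i a), so the adversary
    can shrink the projected mean gap by eps0*sqrt(a'S0 a) + eps1*sqrt(a'S1 a),
    clipped at zero; since KL is increasing in the squared gap (variances
    fixed), this shrunken gap attains the minimum."""
    v0, v1 = a @ S0 @ a, a @ S1 @ a
    gap = abs(a @ (m0 - m1)) - eps0 * np.sqrt(v0) - eps1 * np.sqrt(v1)
    gap = max(gap, 0.0)
    return 0.5 * (np.log(v1 / v0) + (v0 + gap ** 2) / v1 - 1.0)

# Toy check: mean drift degrades the detectability of the projected test.
m0, m1 = np.zeros(3), np.array([1.0, 0.5, 0.0])
S0, S1 = np.eye(3), np.diag([1.0, 2.0, 0.5])
a = (m1 - m0) / np.linalg.norm(m1 - m0)      # naive nominal direction
print(projected_kl(a, m0, m1, S0, S1))
print(worst_case_projected_kl(a, m0, m1, S0, S1, eps0=0.3, eps1=0.3))
```

The abstract does not spell out the sensor selection algorithm, only that it builds on the RLDR solution and stays within 15% of exhaustive search. Purely as a hypothetical illustration, the sketch below implements one natural construction: greedily add the sensor whose coordinates most increase the best achievable worst-case KL of a 1-D projection, with random direction sampling standing in for the paper's grid search over a finite interval (whose parametrization the abstract does not give). It reuses the toy parameters from the previous block.

```python
def best_worst_case_kl(idx, m0, m1, S0, S1, eps0, eps1, n_dirs=500, seed=0):
    """Approximate the best worst-case KL achievable by projecting only the
    selected sensor coordinates idx to one dimension. Random direction
    sampling is a crude stand-in for the paper's exact 1-D grid search."""
    idx = list(idx)
    sub = np.ix_(idx, idx)
    rng = np.random.default_rng(seed)
    dirs = rng.standard_normal((n_dirs, len(idx)))
    dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)
    return max(worst_case_projected_kl(d, m0[idx], m1[idx],
                                       S0[sub], S1[sub], eps0, eps1)
               for d in dirs)

def greedy_sensor_selection(k, m0, m1, S0, S1, eps0, eps1):
    """Hypothetical greedy selection of k sensors (coordinates); one
    plausible construction, not necessarily the paper's algorithm."""
    chosen = []
    while len(chosen) < k:
        j_best = max((j for j in range(len(m0)) if j not in chosen),
                     key=lambda j: best_worst_case_kl(chosen + [j], m0, m1,
                                                      S0, S1, eps0, eps1))
        chosen.append(j_best)
    return chosen

print(greedy_sensor_selection(2, m0, m1, S0, S1, eps0=0.3, eps1=0.3))
```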