{"title":"关于协方差估计中的归一化信噪比","authors":"Tzvi Diskin, Ami Wiesel","doi":"arxiv-2409.10896","DOIUrl":null,"url":null,"abstract":"We address the Normalized Signal to Noise Ratio (NSNR) metric defined in the\nseminal paper by Reed, Mallett and Brennan on adaptive detection. The setting\nis detection of a target vector in additive correlated noise. NSNR is the ratio\nbetween the SNR of a linear detector which uses an estimated noise covariance\nand the SNR of clairvoyant detector based on the exact unknown covariance. It\nis not obvious how to evaluate NSNR since it is a function of the target\nvector. To close this gap, we consider the NSNR associated with the worst\ntarget. Using the Kantorovich Inequality, we provide a closed form solution for\nthe worst case NSNR. Then, we prove that the classical Gaussian Kullback\nLeibler (KL) divergence bounds it. Numerical experiments with different true\ncovariances and various estimates also suggest that the KL metric is more\ncorrelated with the NSNR metric than competing norm based metrics.","PeriodicalId":501034,"journal":{"name":"arXiv - EE - Signal Processing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"On the normalized signal to noise ratio in covariance estimation\",\"authors\":\"Tzvi Diskin, Ami Wiesel\",\"doi\":\"arxiv-2409.10896\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"We address the Normalized Signal to Noise Ratio (NSNR) metric defined in the\\nseminal paper by Reed, Mallett and Brennan on adaptive detection. The setting\\nis detection of a target vector in additive correlated noise. NSNR is the ratio\\nbetween the SNR of a linear detector which uses an estimated noise covariance\\nand the SNR of clairvoyant detector based on the exact unknown covariance. It\\nis not obvious how to evaluate NSNR since it is a function of the target\\nvector. To close this gap, we consider the NSNR associated with the worst\\ntarget. Using the Kantorovich Inequality, we provide a closed form solution for\\nthe worst case NSNR. Then, we prove that the classical Gaussian Kullback\\nLeibler (KL) divergence bounds it. Numerical experiments with different true\\ncovariances and various estimates also suggest that the KL metric is more\\ncorrelated with the NSNR metric than competing norm based metrics.\",\"PeriodicalId\":501034,\"journal\":{\"name\":\"arXiv - EE - Signal Processing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"arXiv - EE - Signal Processing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/arxiv-2409.10896\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - EE - Signal Processing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.10896","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
On the normalized signal to noise ratio in covariance estimation
We address the Normalized Signal to Noise Ratio (NSNR) metric defined in the seminal paper by Reed, Mallett and Brennan on adaptive detection. The setting is the detection of a target vector in additive correlated noise. NSNR is the ratio between the SNR of a linear detector that uses an estimated noise covariance and the SNR of the clairvoyant detector based on the exact, unknown covariance. It is not obvious how to evaluate NSNR, since it is a function of the target vector.
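To make the target dependence concrete, here is a minimal numpy sketch of NSNR under the standard adaptive-filtering formulation, in which the detector applies the filter w = inv(R_est) @ p to a target p. The function names and exact conventions are our assumptions, not quotes from the paper.

```python
import numpy as np

def nsnr(p, R_true, R_est):
    """NSNR of the adaptive filter w = inv(R_est) @ p relative to the
    clairvoyant filter built from the true covariance R_true
    (standard formulation; the paper's conventions may differ)."""
    w = np.linalg.solve(R_est, p)  # adaptive filter
    snr_adaptive = np.abs(np.vdot(w, p)) ** 2 / np.real(np.vdot(w, R_true @ w))
    snr_clairvoyant = np.real(np.vdot(p, np.linalg.solve(R_true, p)))
    return snr_adaptive / snr_clairvoyant  # lies in (0, 1]
```

By construction the ratio never exceeds one, and it equals one whenever R_est is proportional to R_true, so NSNR penalizes only the shape of the covariance error, not its scale.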
To close this gap, we consider the NSNR associated with the worst-case target. Using the Kantorovich inequality, we provide a closed-form expression for this worst-case NSNR.
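The abstract does not reproduce the expression. One closed form consistent with a Kantorovich-inequality argument in this setting, offered here as a derivation sketch rather than the paper's exact theorem, is 4 * lam_min * lam_max / (lam_min + lam_max)^2, where lam_min and lam_max are the extreme eigenvalues of inv(R_est) @ R_true:

```python
import numpy as np
from scipy.linalg import eigh

def worst_case_nsnr(R_true, R_est):
    """Minimum of NSNR over all targets via the Kantorovich bound
    (our reading of the abstract; the paper may state it differently)."""
    # Eigenvalues of inv(R_est) @ R_true, computed as the generalized
    # eigenvalues of the pair (R_true, R_est).
    lam = eigh(R_true, R_est, eigvals_only=True)
    lo, hi = lam.min(), lam.max()
    return 4.0 * lo * hi / (lo + hi) ** 2
```

A quick sanity check is to minimize nsnr(p, R_true, R_est) over many random targets p: the empirical minimum should approach, but never fall below, this value.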
We then prove that the classical Gaussian Kullback-Leibler (KL) divergence bounds this worst-case NSNR. Numerical experiments with different true covariances and various estimates also suggest that the KL metric is more strongly correlated with the NSNR metric than competing norm-based metrics.
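For readers who want to run this kind of comparison themselves, the sketch below evaluates the classical zero-mean Gaussian KL divergence, 0.5 * (tr(M) - d - log det M) with M = inv(R_est) @ R_true, next to a Frobenius-norm distance as a representative norm-based metric. The shrinkage family of estimates is our illustrative choice, not the paper's experimental design.

```python
import numpy as np

def gauss_kl(R_true, R_est):
    """KL( N(0, R_true) || N(0, R_est) ): the classical Gaussian closed
    form. The direction of the divergence here is our assumption."""
    d = R_true.shape[0]
    M = np.linalg.solve(R_est, R_true)  # inv(R_est) @ R_true
    return 0.5 * (np.trace(M) - d - np.linalg.slogdet(M)[1])

# Compare the metrics across shrinkage estimates of one true covariance,
# reusing worst_case_nsnr from the sketch above.
rng = np.random.default_rng(0)
d = 8
A = rng.standard_normal((d, d))
R_true = A @ A.T + d * np.eye(d)  # well-conditioned SPD "true" covariance
for alpha in (0.1, 0.4, 0.7):
    R_est = (1 - alpha) * R_true + alpha * (np.trace(R_true) / d) * np.eye(d)
    print(f"alpha={alpha:.1f}  KL={gauss_kl(R_true, R_est):.4f}  "
          f"Frobenius={np.linalg.norm(R_true - R_est, 'fro'):.2f}  "
          f"worst-case NSNR={worst_case_nsnr(R_true, R_est):.4f}")
```

Which metric's ordering of the estimates better tracks the worst-case NSNR is exactly the comparison the abstract reports in favor of KL.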