{"title":"基于最小误差熵的有下界共信道干扰信道估计器","authors":"V. Bhatia, B. Mulgrew","doi":"10.1109/SPCOM.2004.1458353","DOIUrl":null,"url":null,"abstract":"Extensive work to develop and optimize signal processing for signals that are corrupted by additive Gaussian noise has been done so far mainly because of the central limit theorem and the ease in analytic manipulations. It has been observed that the algorithms designed for Gaussian noise typically perform poor in presence of non-Gaussian noise. This paper discusses an error entropy minimization algorithm using kernel density estimates to improve channel estimation in non-Gaussian noise environment. Entropy is a measure of average information contained in a given probability density function. This probability density is assumed unknown and is estimated by using kernel density estimator. Thereby combining entropy cost function with kernel density estimate provides a robust channel estimator in presence of co-channel interference. New lower bounds for co-channel interference effected channel in presence of Gaussian noise are presented as a measure of performance. The simulations for channel estimator in co-channel interference plus Gaussian noise effected channel confirms that a better estimate can be obtained by using the proposed technique as compared to the traditional least squares algorithm, which is considered optimal in the Gaussian noise environments.","PeriodicalId":424981,"journal":{"name":"2004 International Conference on Signal Processing and Communications, 2004. SPCOM '04.","volume":"37 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2004-12-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"2","resultStr":"{\"title\":\"A minimum error entropy based channel estimator in presence of co-channel interference with lower bounds\",\"authors\":\"V. Bhatia, B. Mulgrew\",\"doi\":\"10.1109/SPCOM.2004.1458353\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Extensive work to develop and optimize signal processing for signals that are corrupted by additive Gaussian noise has been done so far mainly because of the central limit theorem and the ease in analytic manipulations. It has been observed that the algorithms designed for Gaussian noise typically perform poor in presence of non-Gaussian noise. This paper discusses an error entropy minimization algorithm using kernel density estimates to improve channel estimation in non-Gaussian noise environment. Entropy is a measure of average information contained in a given probability density function. This probability density is assumed unknown and is estimated by using kernel density estimator. Thereby combining entropy cost function with kernel density estimate provides a robust channel estimator in presence of co-channel interference. New lower bounds for co-channel interference effected channel in presence of Gaussian noise are presented as a measure of performance. The simulations for channel estimator in co-channel interference plus Gaussian noise effected channel confirms that a better estimate can be obtained by using the proposed technique as compared to the traditional least squares algorithm, which is considered optimal in the Gaussian noise environments.\",\"PeriodicalId\":424981,\"journal\":{\"name\":\"2004 International Conference on Signal Processing and Communications, 2004. 
SPCOM '04.\",\"volume\":\"37 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2004-12-11\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"2\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2004 International Conference on Signal Processing and Communications, 2004. SPCOM '04.\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/SPCOM.2004.1458353\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2004 International Conference on Signal Processing and Communications, 2004. SPCOM '04.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/SPCOM.2004.1458353","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A minimum error entropy based channel estimator in presence of co-channel interference with lower bounds
Extensive work on developing and optimizing signal processing for signals corrupted by additive Gaussian noise has been carried out, largely because of the central limit theorem and the ease of analytic manipulation. It has been observed that algorithms designed for Gaussian noise typically perform poorly in the presence of non-Gaussian noise. This paper discusses an error-entropy minimization algorithm that uses kernel density estimates to improve channel estimation in non-Gaussian noise environments. Entropy is a measure of the average information contained in a given probability density function. This probability density is assumed unknown and is estimated with a kernel density estimator. Combining the entropy cost function with the kernel density estimate therefore yields a channel estimator that is robust in the presence of co-channel interference. New lower bounds for a channel affected by co-channel interference in the presence of Gaussian noise are presented as a measure of performance. Simulations of the channel estimator on a channel affected by co-channel interference plus Gaussian noise confirm that the proposed technique yields a better estimate than the traditional least-squares algorithm, which is optimal in Gaussian noise environments.
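The abstract describes minimizing the entropy of the estimation error, with the unknown error density supplied by a kernel (Parzen) density estimate. A common way to realize this criterion is to minimize Renyi's quadratic entropy by gradient ascent on the quadratic information potential V = (1/N^2) * sum_{i,j} kappa_sigma(e_i - e_j). The sketch below follows that route under stated assumptions: a linear FIR channel model, a Gaussian kernel, batch gradient updates, and a least-squares initialisation. The function name, step size, kernel width, and the data-generation demo are illustrative choices, not details taken from the paper.

```python
import numpy as np

def mee_channel_estimate(X, y, sigma=1.0, mu=0.5, n_iters=200):
    """Hypothetical minimum-error-entropy (MEE) channel estimator sketch.

    Minimizes Renyi's quadratic entropy of the error by gradient ascent on the
    information potential V = (1/N^2) * sum_ij kappa_sigma(e_i - e_j), where the
    error density is implicitly modelled by a Gaussian Parzen kernel of width sigma.

    X : (N, L) regressor matrix of transmitted symbols (L channel taps)
    y : (N,)   received samples, y = X @ h + co-channel interference + noise
    """
    N, _ = X.shape
    h_hat = np.linalg.lstsq(X, y, rcond=None)[0]       # least-squares initialisation
    dx = X[:, None, :] - X[None, :, :]                 # pairwise regressor differences x_i - x_j

    for _ in range(n_iters):
        e = y - X @ h_hat                              # instantaneous errors
        de = e[:, None] - e[None, :]                   # pairwise error differences e_i - e_j
        k = np.exp(-de**2 / (2.0 * sigma**2))          # Gaussian kernel on each difference
        # dV/dh = (1 / (N^2 sigma^2)) * sum_ij k_ij * (e_i - e_j) * (x_i - x_j)
        grad = ((k * de)[:, :, None] * dx).sum(axis=(0, 1)) / (N**2 * sigma**2)
        h_hat = h_hat + mu * grad                      # ascend V  <=>  descend the error entropy
    return h_hat

# Illustrative comparison against plain least squares on a channel corrupted by
# a BPSK co-channel interferer plus Gaussian noise (all values are made up).
rng = np.random.default_rng(0)
h_true = np.array([1.0, 0.5, -0.3])
s = rng.choice([-1.0, 1.0], size=400)                  # BPSK training symbols
X = np.column_stack([np.roll(s, k) for k in range(h_true.size)])
y = X @ h_true + 0.5 * rng.choice([-1.0, 1.0], size=400) + 0.1 * rng.standard_normal(400)
print("LS :", np.linalg.lstsq(X, y, rcond=None)[0])
print("MEE:", mee_channel_estimate(X, y, sigma=0.8))
```

The kernel width sigma controls how heavily large error differences (such as those caused by interference hits) are discounted, which is what gives the entropy criterion its robustness relative to the squared-error cost; in practice it would be tuned alongside the step size mu.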