M. Stecker
The Open Signal Processing Journal, 2011-04-19
DOI: 10.2174/1876825301104010001
Constrained Signals: A General Theory of Information Content and Detection
In this paper, a general theory of signals characterized by probabilistic constraints is developed. As in previous work (10), the theoretical development employs Lagrange multipliers to implement the constraints and the maximum entropy principle to generate the most likely probability distribution function consistent with the constraints. The method of computing the probability distribution functions is similar to that used in computing partition functions in statistical mechanics. Simple cases in which exact analytic solutions for the maximum entropy distribution functions and entropy exist are studied, and their implications are discussed. The application of this technique to the problem of signal detection is explored both theoretically and with simulations. It is demonstrated that the method can readily classify signals governed by different constraint distributions as long as the mean values of the constraints for the two distributions differ. Classifying signals governed by constraint distributions that differ in shape but not in mean value is much more difficult. Some solutions to this problem and extensions of the method are discussed.
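The core construction the abstract describes, maximizing entropy subject to a mean constraint via a Lagrange multiplier, admits a standard closed form on a finite alphabet: the optimal distribution is a Gibbs distribution p_i ∝ exp(−λx_i), with λ chosen so the constraint is met, exactly as when computing a partition function in statistical mechanics. The sketch below is not the paper's code; it is a minimal illustration of that textbook construction, with the function name and the bisection search for λ being this sketch's own choices.

```python
import numpy as np

def max_entropy_dist(values, target_mean, lam_lo=-50.0, lam_hi=50.0, tol=1e-10):
    """Maximum-entropy distribution on a finite alphabet with a mean constraint.

    Maximizing H(p) = -sum_i p_i log p_i subject to sum_i p_i = 1 and
    sum_i p_i * values[i] = target_mean yields the Gibbs form
        p_i = exp(-lam * values[i]) / Z(lam),
    where Z(lam) is the partition function and lam is the Lagrange
    multiplier, found here by bisection on the mean-constraint residual.
    """
    values = np.asarray(values, dtype=float)

    def mean_at(lam):
        # Shift by the minimum before exponentiating for numerical stability;
        # the shift cancels in the normalization.
        w = np.exp(-lam * (values - values.min()))
        p = w / w.sum()
        return float(p @ values)

    # The constrained mean is strictly decreasing in lam (its derivative is
    # minus the variance), so a simple bisection locates the multiplier.
    lo, hi = lam_lo, lam_hi
    for _ in range(200):
        mid = 0.5 * (lo + hi)
        if mean_at(mid) > target_mean:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    lam = 0.5 * (lo + hi)

    w = np.exp(-lam * (values - values.min()))
    p = w / w.sum()
    entropy = float(-np.sum(p * np.log(p)))
    return p, lam, entropy
```

With a symmetric alphabet such as {0, 1, 2, 3} and target mean 1.5, the multiplier comes out at zero and the distribution is uniform, recovering the unconstrained maximum-entropy case; pushing the target mean toward either end of the alphabet skews the distribution exponentially. This also illustrates the abstract's classification point: two constraint distributions with different target means produce visibly different Gibbs weights, whereas distributions matched in mean collapse to the same exponential family member and are far harder to tell apart.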