{"title":"关于检验假设中的一个有用不等式","authors":"M. Burnashev","doi":"10.1109/18.681348","DOIUrl":null,"url":null,"abstract":"A simple proof of one probabilistic inequality is presented. Let P and Q be two given probability measures on a measurable space (/spl Xscr/, /spl Ascr/). We consider testing of hypotheses P and Q using one observation. For an arbitrary decision rule, let /spl alpha/ and /spl beta/ denote the two kinds of error probabilities. If both error probabilities have equal costs (or we want to minimize the maximum of them) then it is natural to investigate the minimal possible sum inf{/spl alpha/+/spl beta/} for the best decision rule.","PeriodicalId":13250,"journal":{"name":"IEEE Trans. Inf. Theory","volume":"395 1","pages":"1668-1670"},"PeriodicalIF":0.0000,"publicationDate":"1998-07-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":"{\"title\":\"On One Useful Inequality in Testing of Hypotheses\",\"authors\":\"M. Burnashev\",\"doi\":\"10.1109/18.681348\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A simple proof of one probabilistic inequality is presented. Let P and Q be two given probability measures on a measurable space (/spl Xscr/, /spl Ascr/). We consider testing of hypotheses P and Q using one observation. For an arbitrary decision rule, let /spl alpha/ and /spl beta/ denote the two kinds of error probabilities. If both error probabilities have equal costs (or we want to minimize the maximum of them) then it is natural to investigate the minimal possible sum inf{/spl alpha/+/spl beta/} for the best decision rule.\",\"PeriodicalId\":13250,\"journal\":{\"name\":\"IEEE Trans. Inf. Theory\",\"volume\":\"395 1\",\"pages\":\"1668-1670\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"1998-07-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"6\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Trans. Inf. Theory\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/18.681348\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Trans. Inf. Theory","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/18.681348","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
A simple proof of a probabilistic inequality is presented. Let $P$ and $Q$ be two given probability measures on a measurable space $(\mathcal{X}, \mathcal{A})$. We consider testing the hypotheses $P$ and $Q$ on the basis of a single observation. For an arbitrary decision rule, let $\alpha$ and $\beta$ denote the error probabilities of the first and second kind. If both kinds of error carry equal cost (or we want to minimize the larger of the two), then it is natural to study the minimal possible sum $\inf\{\alpha + \beta\}$ over all decision rules.
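For context, the quantity $\inf\{\alpha+\beta\}$ has a standard closed form: the optimal (likelihood-ratio) rule attains $\inf\{\alpha+\beta\} = \int \min(dP, dQ) = 1 - \mathrm{TV}(P, Q)$, where $\mathrm{TV}$ denotes total variation distance. The following is a minimal Python sketch of this identity for the discrete case; the distributions `p` and `q` are made-up examples, and the identity shown is the classical one, not the paper's inequality itself.

```python
import numpy as np

# Hypothetical discrete illustration (not from the paper): for distributions
# P and Q on a finite set, the optimal rule accepts P exactly where
# P(x) >= Q(x), and it attains
#     inf{alpha + beta} = sum_x min(P(x), Q(x)) = 1 - TV(P, Q).

def min_error_sum(p: np.ndarray, q: np.ndarray) -> float:
    """inf{alpha + beta} over all decision rules, for a single observation."""
    return float(np.minimum(p, q).sum())

# Example distributions (chosen for illustration only).
p = np.array([0.5, 0.3, 0.2])
q = np.array([0.2, 0.3, 0.5])

print(min_error_sum(p, q))  # 0.7, i.e. TV(P, Q) = 0.3
```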