{"title":"考虑到统计学中的距离度量","authors":"C. Kitsos, C. Nisiotis","doi":"10.2478/bile-2022-0006","DOIUrl":null,"url":null,"abstract":"Summary The target of this paper is to offer a compact review of the so called distance methods in Statistics, which cover all the known estimation methods. Based on this fact we propose a new step, to adopt from Information Theory, the divergence measures, as distance methods, to compare two distributions, and not only to investigate if the means or the variances of the distributions are equal. Some useful results towards this line of thought are presented, adopting a compact form for all known divergence measures, and are appropriately analyzed for Biometrical, and not only, applications.","PeriodicalId":8933,"journal":{"name":"Biometrical Letters","volume":"42 1","pages":"65 - 75"},"PeriodicalIF":0.0000,"publicationDate":"2022-06-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Considering distance measures in Statistics\",\"authors\":\"C. Kitsos, C. Nisiotis\",\"doi\":\"10.2478/bile-2022-0006\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Summary The target of this paper is to offer a compact review of the so called distance methods in Statistics, which cover all the known estimation methods. Based on this fact we propose a new step, to adopt from Information Theory, the divergence measures, as distance methods, to compare two distributions, and not only to investigate if the means or the variances of the distributions are equal. 
Some useful results towards this line of thought are presented, adopting a compact form for all known divergence measures, and are appropriately analyzed for Biometrical, and not only, applications.\",\"PeriodicalId\":8933,\"journal\":{\"name\":\"Biometrical Letters\",\"volume\":\"42 1\",\"pages\":\"65 - 75\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-06-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Biometrical Letters\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.2478/bile-2022-0006\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Biometrical Letters","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.2478/bile-2022-0006","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
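The paper's compact unified form for divergence measures is not reproduced in this record; as a minimal illustrative sketch, the Kullback–Leibler divergence, the best-known divergence measure, compares two discrete distributions pointwise rather than through their means or variances alone. The function name and the example distributions below are hypothetical choices for illustration:

```python
import math

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) between two discrete
    probability distributions given as equal-length sequences.
    Terms with p_i = 0 contribute 0 by convention."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# Two distributions with identical support but different shapes
p = [0.5, 0.3, 0.2]
q = [0.4, 0.4, 0.2]

print(kl_divergence(p, q))  # positive: the distributions differ
print(kl_divergence(p, p))  # 0.0: a distribution is at zero "distance" from itself
```

Note that KL divergence is not symmetric (D(p||q) ≠ D(q||p) in general), which is one reason the literature reviewed in the paper considers a whole family of divergence measures rather than a single one.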