Information cut and information forces for clustering
R. Jenssen, J. Príncipe, T. Eltoft
2003 IEEE XIII Workshop on Neural Networks for Signal Processing (IEEE Cat. No.03TH8718), 2003-09-17
DOI: 10.1109/NNSP.2003.1318045
We define an information-theoretic divergence measure between probability density functions (pdfs) that has a deep connection to the cut in graph theory. This connection is revealed when the pdfs are estimated by the Parzen method with a Gaussian kernel. We refer to our divergence measure as the information cut. The information cut provides a theoretically sound criterion for cluster evaluation, and in this paper we show that it can be used to merge clusters. The initial clusters are obtained from the related concept of information forces. We create directed trees by selecting the predecessor of each node (pattern) according to the direction of the information force acting on that pattern. Each directed tree corresponds to a cluster, yielding an initial partitioning of the data set. Subsequently, we use the information cut as a cluster evaluation function to merge clusters until the predefined number of clusters is reached. We demonstrate the performance of our novel information-theoretic clustering method on both artificially created data and real data, with encouraging results.
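The two quantities the abstract describes can be sketched in a few lines of NumPy. This is an illustrative reconstruction, not the paper's exact formulation: here the information cut is approximated as a Cauchy-Schwarz-style normalized kernel cut (cross-cluster kernel sum over the geometric mean of the within-cluster kernel sums), and the information force on a sample as the Parzen-density gradient at that sample; `sigma` (the Gaussian kernel width) and the function names are assumptions for the sketch.

```python
import numpy as np

def gaussian_affinity(X, sigma):
    """Pairwise Gaussian kernel values k(x_i, x_j), as in a Parzen estimate."""
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-d2 / (2.0 * sigma ** 2))

def information_cut(K, labels, a, b):
    """Illustrative normalized cut between clusters a and b: the sum of
    cross-cluster affinities divided by the geometric mean of the two
    within-cluster affinity sums. Small values mean well-separated clusters,
    so a merging loop would keep the partition with the smallest cut."""
    ia, ib = labels == a, labels == b
    cut = K[np.ix_(ia, ib)].sum()
    return cut / np.sqrt(K[np.ix_(ia, ia)].sum() * K[np.ix_(ib, ib)].sum())

def information_forces(X, sigma):
    """Net 'force' on each sample: every neighbour x_j pulls x_i toward
    itself with weight k(x_i, x_j), i.e. the (unnormalized) gradient of the
    Parzen density at x_i. Forces point toward dense regions, which is what
    lets each pattern pick a predecessor and form a directed tree."""
    K = gaussian_affinity(X, sigma)
    diff = X[None, :, :] - X[:, None, :]  # diff[i, j] = x_j - x_i
    return (K[:, :, None] * diff).sum(axis=1)
```

As a sanity check, on two well-separated blobs the information cut of the true partition is far smaller than that of an arbitrary mixed partition — exactly the property a merge criterion needs.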