{"title":"基于von Mises-Fisher分布的深度自适应图聚类","authors":"Pengfei Wang, Daqing Wu, Chong Chen, Kunpeng Liu, Yanjie Fu, Jianqiang Huang, Yuanchun Zhou, Jianfeng Zhan, Xiansheng Hua","doi":"https://dl.acm.org/doi/10.1145/3580521","DOIUrl":null,"url":null,"abstract":"<p>Graph clustering has been a hot research topic and is widely used in many fields, such as community detection in social networks. Lots of works combining auto-encoder and graph neural networks have been applied to clustering tasks by utilizing node attributes and graph structure. These works usually assumed the inherent parameters (i.e. size and variance) of different clusters in the latent embedding space are homogeneous, and hence the assigned probability is monotonous over the Euclidean distance between node embeddings and centroids. Unfortunately, this assumption usually does not hold since the size and concentration of different clusters can be quite different, which limits the clustering accuracy. In addition, the node embeddings in deep graph clustering methods are usually L2 normalized so that it lies on the surface of a unit hyper-sphere. To solve this problem, we proposed <underline><b>D</b></underline>eep <underline><b>A</b></underline>daptive <underline><b>G</b></underline>raph <underline><b>C</b></underline>lustering via von Mises-Fisher distributions, namely DAGC. DAGC assumes the node embeddings <b>H</b> can be drawn from a von Mises-Fisher distribution and each cluster <i>k</i> is associated with cluster inherent parameters <b><i>ρ</i></b><sub><i>k</i></sub> which includes cluster center <b><i>μ</i></b> and cluster cohesion degree <i>κ</i>. Then we adopt an EM-like approach (i.e. \\(\\mathcal {P}(\\mathbf {H}|\\mathbf {\\rho }) \\) and \\(\\mathcal {P}(\\mathbf {\\rho }|\\mathbf {H}) \\) respectively) to learn the embedding and cluster inherent parameters alternately. Specifically, with the node embeddings, we proposed to update the cluster centers in an attraction-repulsion manner to make the cluster centers more separable. And given the cluster inherent parameters, a likelihood-based loss is proposed to make node embeddings more concentrated around cluster centers. Thus, DAGC can simultaneously improve the intra-cluster compactness and inter-cluster heterogeneity. Finally, extensive experiments conducted on four benchmark datasets have demonstrated that the proposed DAGC consistently outperforms the state-of-the-art methods, especially on imbalanced datasets.</p>","PeriodicalId":50940,"journal":{"name":"ACM Transactions on the Web","volume":"43 35","pages":""},"PeriodicalIF":2.6000,"publicationDate":"2023-01-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Deep Adaptive Graph Clustering via von Mises-Fisher Distributions\",\"authors\":\"Pengfei Wang, Daqing Wu, Chong Chen, Kunpeng Liu, Yanjie Fu, Jianqiang Huang, Yuanchun Zhou, Jianfeng Zhan, Xiansheng Hua\",\"doi\":\"https://dl.acm.org/doi/10.1145/3580521\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>Graph clustering has been a hot research topic and is widely used in many fields, such as community detection in social networks. Lots of works combining auto-encoder and graph neural networks have been applied to clustering tasks by utilizing node attributes and graph structure. These works usually assumed the inherent parameters (i.e. 
size and variance) of different clusters in the latent embedding space are homogeneous, and hence the assigned probability is monotonous over the Euclidean distance between node embeddings and centroids. Unfortunately, this assumption usually does not hold since the size and concentration of different clusters can be quite different, which limits the clustering accuracy. In addition, the node embeddings in deep graph clustering methods are usually L2 normalized so that it lies on the surface of a unit hyper-sphere. To solve this problem, we proposed <underline><b>D</b></underline>eep <underline><b>A</b></underline>daptive <underline><b>G</b></underline>raph <underline><b>C</b></underline>lustering via von Mises-Fisher distributions, namely DAGC. DAGC assumes the node embeddings <b>H</b> can be drawn from a von Mises-Fisher distribution and each cluster <i>k</i> is associated with cluster inherent parameters <b><i>ρ</i></b><sub><i>k</i></sub> which includes cluster center <b><i>μ</i></b> and cluster cohesion degree <i>κ</i>. Then we adopt an EM-like approach (i.e. \\\\(\\\\mathcal {P}(\\\\mathbf {H}|\\\\mathbf {\\\\rho }) \\\\) and \\\\(\\\\mathcal {P}(\\\\mathbf {\\\\rho }|\\\\mathbf {H}) \\\\) respectively) to learn the embedding and cluster inherent parameters alternately. Specifically, with the node embeddings, we proposed to update the cluster centers in an attraction-repulsion manner to make the cluster centers more separable. And given the cluster inherent parameters, a likelihood-based loss is proposed to make node embeddings more concentrated around cluster centers. Thus, DAGC can simultaneously improve the intra-cluster compactness and inter-cluster heterogeneity. Finally, extensive experiments conducted on four benchmark datasets have demonstrated that the proposed DAGC consistently outperforms the state-of-the-art methods, especially on imbalanced datasets.</p>\",\"PeriodicalId\":50940,\"journal\":{\"name\":\"ACM Transactions on the Web\",\"volume\":\"43 35\",\"pages\":\"\"},\"PeriodicalIF\":2.6000,\"publicationDate\":\"2023-01-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"ACM Transactions on the Web\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://doi.org/https://dl.acm.org/doi/10.1145/3580521\",\"RegionNum\":4,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q2\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Transactions on the Web","FirstCategoryId":"94","ListUrlMain":"https://doi.org/https://dl.acm.org/doi/10.1145/3580521","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q2","JCRName":"COMPUTER SCIENCE, INFORMATION SYSTEMS","Score":null,"Total":0}
Deep Adaptive Graph Clustering via von Mises-Fisher Distributions
Graph clustering has been a hot research topic and is widely used in many fields, such as community detection in social networks. Many works combining auto-encoders and graph neural networks have been applied to clustering tasks by exploiting node attributes and graph structure. These works usually assume that the inherent parameters (i.e., size and variance) of different clusters in the latent embedding space are homogeneous, so that the assignment probability is monotonic in the Euclidean distance between node embeddings and centroids. Unfortunately, this assumption usually does not hold, since the size and concentration of different clusters can differ considerably, which limits clustering accuracy. In addition, the node embeddings in deep graph clustering methods are usually L2-normalized so that they lie on the surface of a unit hypersphere. To address this problem, we propose Deep Adaptive Graph Clustering via von Mises-Fisher distributions, namely DAGC. DAGC assumes that the node embeddings H are drawn from von Mises-Fisher distributions, and each cluster k is associated with inherent parameters ρ_k, which include the cluster center μ_k and the cluster cohesion degree κ_k. We then adopt an EM-like approach (i.e., \(\mathcal {P}(\mathbf {H}|\mathbf {\rho }) \) and \(\mathcal {P}(\mathbf {\rho }|\mathbf {H}) \), respectively) to learn the node embeddings and the cluster inherent parameters alternately. Specifically, given the node embeddings, we update the cluster centers in an attraction-repulsion manner to make them more separable; given the cluster inherent parameters, a likelihood-based loss is used to concentrate the node embeddings around their cluster centers. Thus, DAGC can simultaneously improve intra-cluster compactness and inter-cluster heterogeneity. Finally, extensive experiments on four benchmark datasets demonstrate that DAGC consistently outperforms state-of-the-art methods, especially on imbalanced datasets.
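The abstract describes an EM-like alternation between \(\mathcal {P}(\mathbf {H}|\mathbf {\rho }) \) and \(\mathcal {P}(\mathbf {\rho }|\mathbf {H}) \) for a von Mises-Fisher (vMF) model with per-cluster concentrations. The sketch below illustrates only the cluster-parameter side of such an alternation on fixed, unit-norm embeddings: a vMF soft assignment, a standard moment-based re-estimation of μ and κ, and one plausible reading of the attraction-repulsion center update. It is not the authors' implementation; the function names, the Banerjee-style κ approximation, and the step sizes are illustrative assumptions.

```python
import numpy as np
from scipy.special import ive  # exponentially scaled modified Bessel function of the first kind


def vmf_log_norm_const(kappa, d):
    # log C_d(kappa) for the vMF density on the (d-1)-sphere:
    #   C_d(k) = k^(d/2 - 1) / ((2*pi)^(d/2) * I_{d/2-1}(k))
    v = d / 2.0 - 1.0
    log_bessel = np.log(ive(v, kappa)) + kappa  # log I_v(k), computed stably via ive
    return v * np.log(kappa) - (d / 2.0) * np.log(2.0 * np.pi) - log_bessel


def vmf_soft_assign(H, mu, kappa):
    # Posterior P(cluster k | h_i) for unit-norm embeddings H (n, d),
    # cluster directions mu (K, d), and per-cluster concentrations kappa (K,).
    d = H.shape[1]
    # log p(h | mu_k, kappa_k) = log C_d(kappa_k) + kappa_k * <mu_k, h>
    logits = vmf_log_norm_const(kappa, d)[None, :] + (H @ mu.T) * kappa[None, :]
    logits -= logits.max(axis=1, keepdims=True)        # numerical stability
    probs = np.exp(logits)
    return probs / probs.sum(axis=1, keepdims=True)


def estimate_cluster_params(H, gamma):
    # Re-estimate mu and kappa from soft assignments gamma (n, K), using the
    # common approximation kappa ~= r_bar * (d - r_bar^2) / (1 - r_bar^2).
    d = H.shape[1]
    weighted = gamma.T @ H                              # (K, d) weighted sums of embeddings
    r_norm = np.linalg.norm(weighted, axis=1)
    mu = weighted / (r_norm[:, None] + 1e-12)
    r_bar = r_norm / (gamma.sum(axis=0) + 1e-12)        # mean resultant length per cluster
    kappa = r_bar * (d - r_bar ** 2) / (1.0 - r_bar ** 2 + 1e-12)
    return mu, kappa


def attraction_repulsion_update(mu, H, gamma, attract_step=0.5, repel_step=0.1):
    # Illustrative reading (not the paper's exact rule): pull each center toward the
    # soft mean direction of its members, push it away from the mean of the other
    # centers, then re-project onto the unit sphere.
    K = mu.shape[0]
    attract = gamma.T @ H
    attract /= np.linalg.norm(attract, axis=1, keepdims=True) + 1e-12
    others = (mu.sum(axis=0, keepdims=True) - mu) / (K - 1)
    new_mu = mu + attract_step * attract - repel_step * others
    return new_mu / np.linalg.norm(new_mu, axis=1, keepdims=True)


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    H = rng.normal(size=(200, 16))
    H /= np.linalg.norm(H, axis=1, keepdims=True)       # project embeddings onto the unit sphere
    mu = H[rng.choice(200, size=3, replace=False)]      # random initial centers
    kappa = np.full(3, 10.0)
    for _ in range(10):                                 # EM-like alternation on fixed embeddings
        gamma = vmf_soft_assign(H, mu, kappa)
        mu, kappa = estimate_cluster_params(H, gamma)
        mu = attraction_repulsion_update(mu, H, gamma)
```

In a full deep clustering pipeline, H would instead come from the graph encoder and be re-normalized after each gradient step, with the vMF likelihood under the current (μ, κ) serving as the clustering loss on the embedding side; the sketch only shows how the posterior and cluster parameters could be computed from fixed embeddings.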
Journal description:
Transactions on the Web (TWEB) is a journal publishing refereed articles reporting the results of research on Web content, applications, use, and related enabling technologies. Topics in the scope of TWEB include, but are not limited to, the following: Browsers and Web Interfaces; Electronic Commerce; Electronic Publishing; Hypertext and Hypermedia; Semantic Web; Web Engineering; Web Services and Service-Oriented Computing; and XML.
In addition, papers addressing the intersection of the following broader technologies with the Web are also in scope: Accessibility; Business Services; Education; Knowledge Management and Representation; Mobility and Pervasive Computing; Performance and Scalability; Recommender Systems; Searching, Indexing, Classification, Retrieval and Querying; Data Mining and Analysis; Security and Privacy; and User Interfaces.
Papers discussing specific Web technologies, applications, and content generation, management, and use are within scope. Papers describing novel applications of the Web, as well as papers on the underlying technologies, are also welcome.