Asymmetric Discrete Cross-Modal Hashing
Xin Luo, P. Zhang, Ye Wu, Zhen-Duo Chen, Hua-Junjie Huang, Xin-Shun Xu
Proceedings of the 2018 ACM on International Conference on Multimedia Retrieval, June 5, 2018. DOI: 10.1145/3206025.3206034
Recently, cross-modal hashing (CMH) methods have attracted much attention. Many methods have been explored; however, several issues still need further consideration: 1) how to efficiently construct the correlations among heterogeneous modalities; 2) how to solve the NP-hard optimization problem while avoiding the large quantization errors introduced by relaxation; and 3) how to handle the challenging problem, common to most CMH methods, of simultaneously learning the hash codes and the hash functions. To address these challenges, we present a novel cross-modal hashing algorithm, named Asymmetric Discrete Cross-Modal Hashing (ADCH). Specifically, it leverages the collective matrix factorization technique to learn common latent representations that preserve not only the cross-correlation between different modalities but also the semantic similarity. Instead of relaxing the binary constraints, it generates the hash codes directly using an iterative optimization algorithm proposed in this work. Based on the learnt hash codes, ADCH further learns a series of binary classifiers as hash functions, which is flexible and effective. Extensive experiments are conducted on three real-world datasets. The results demonstrate that ADCH outperforms several state-of-the-art cross-modal hashing baselines.
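To illustrate the last step the abstract describes, the sketch below shows one plausible way to learn per-bit binary classifiers as hash functions once the discrete codes are available. This is a minimal sketch under stated assumptions, not the paper's actual procedure: the choice of scikit-learn's LinearSVC, the code length, and all variable names are introduced here purely for illustration.

```python
# Hypothetical sketch of the hash-function learning stage: given binary codes B
# already produced by the discrete optimization step, fit one binary classifier
# per bit so that an unseen sample can be hashed by evaluating the classifiers.
# LinearSVC and the dimensions below are illustrative assumptions, not the paper's choices.
import numpy as np
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n, d, r = 1000, 512, 32                        # training samples, feature dim, code length (assumed)
X = rng.normal(size=(n, d))                    # features of one modality (e.g., images)
B = np.where(rng.normal(size=(n, r)) >= 0, 1, -1)  # stand-in for learned codes in {-1, +1}

# One classifier per bit: bit k of each code is treated as a binary label.
classifiers = [LinearSVC().fit(X, B[:, k]) for k in range(r)]

def hash_query(x):
    """Map a new feature vector to an r-bit code by evaluating the per-bit classifiers."""
    return np.array([clf.predict(x.reshape(1, -1))[0] for clf in classifiers], dtype=int)

print(hash_query(rng.normal(size=d)))
```

At query time, a new sample from a given modality would be hashed by evaluating that modality's classifiers bit by bit, which keeps this stage decoupled from the discrete code-learning step.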