Joint Cluster Unary Loss for Efficient Cross-Modal Hashing

Shifeng Zhang, Jianmin Li, Bo Zhang
Proceedings of the 2019 on International Conference on Multimedia Retrieval · DOI: 10.1145/3323873.3325059 · Published: 2019-02-02 · Citations: 4

Abstract

Recently, cross-modal deep hashing has received broad attention for solving cross-modal retrieval problems efficiently. Most cross-modal hashing methods generate $O(n^2)$ data pairs and $O(n^3)$ data triplets for training, so the training procedure is inefficient on large-scale datasets. In this paper, we propose a novel and efficient cross-modal hashing algorithm named Joint Cluster Cross-Modal Hashing (JCCH). First, we introduce the Cross-Modal Unary Loss (CMUL), with $O(n)$ complexity, to bridge the traditional triplet loss and the classification-based unary loss, and we build the JCCH algorithm on CMUL. Second, CMUL incorporates a tighter bound on the triplet loss for structured multilabel data. The resulting hashcodes form several clusters in which codes in the same cluster share similar semantic information, and the heterogeneity gap between modalities is diminished by sharing the clusters across modalities. Experiments on large-scale datasets show that the proposed method is superior to or comparable with state-of-the-art cross-modal hashing methods, and that training with it is more efficient.
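The abstract's central efficiency claim is the gap between enumerating $O(n^3)$ triplets and computing one unary term per sample in $O(n)$. The sketch below illustrates that contrast only; it is not the paper's actual CMUL formulation — the loss definitions, cluster centers, and toy data are illustrative assumptions.

```python
import numpy as np

def triplet_loss_all(codes, labels, margin=1.0):
    """Naive margin-based triplet loss over every (anchor, positive,
    negative) triplet -- O(n^3) terms."""
    n = len(codes)
    total, count = 0.0, 0
    for a in range(n):
        for p in range(n):
            if p == a or labels[p] != labels[a]:
                continue  # positives share the anchor's label
            for q in range(n):
                if labels[q] == labels[a]:
                    continue  # negatives have a different label
                d_ap = np.sum((codes[a] - codes[p]) ** 2)
                d_aq = np.sum((codes[a] - codes[q]) ** 2)
                total += max(0.0, margin + d_ap - d_aq)
                count += 1
    return total / max(count, 1)

def unary_cluster_loss(codes, labels, centers):
    """Classification-style unary loss: pull each code toward its class's
    cluster center -- one term per sample, O(n)."""
    diffs = codes - centers[labels]          # (n, d)
    return float(np.mean(np.sum(diffs ** 2, axis=1)))

# Toy example: 6 relaxed (real-valued) codes around 2 class centers.
rng = np.random.default_rng(0)
labels = np.array([0, 0, 0, 1, 1, 1])
centers = np.array([[1.0, 1.0, -1.0], [-1.0, -1.0, 1.0]])
codes = centers[labels] + 0.1 * rng.standard_normal((6, 3))

print("triplet (O(n^3) terms):", triplet_loss_all(codes, labels))
print("unary   (O(n) terms):  ", unary_cluster_loss(codes, labels, centers))
```

Both losses reach zero when codes collapse onto well-separated class centers; the unary form gets there with $n$ terms per pass instead of $O(n^3)$, which is the kind of saving the paper attributes to CMUL.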