Deep Multiple Length Hashing via Multi-task Learning

Letian Wang, Xiushan Nie, Quan Zhou, Yang Shi, Xingbo Liu
{"title":"Deep Multiple Length Hashing via Multi-task Learning","authors":"Letian Wang, Xiushan Nie, Quan Zhou, Yang Shi, Xingbo Liu","doi":"10.1145/3469877.3493591","DOIUrl":null,"url":null,"abstract":"Hashing can compress heterogeneous high-dimensional data into compact binary codes. For most existing hash methods, they first predetermine a fixed length for the hash code and then train the model based on this fixed length. However, when the task requirements change, these methods need to retrain the model for a new length of hash codes, which increases time cost. To address this issue, we propose a deep supervised hashing method, called deep multiple length hashing(DMLH), which can learn multiple length hash codes simultaneously based on a multi-task learning network. This proposed DMLH can well utilize the relationships with a hard parameter sharing-based multi-task network. Specifically, in DMLH, the multiple hash codes with different lengths are regarded as different views of the same sample. Furthermore, we introduce a type of mutual information loss to mine the association among hash codes of different lengths. Extensive experiments have indicated that DMLH outperforms most existing models, verifying its effectiveness.","PeriodicalId":210974,"journal":{"name":"ACM Multimedia Asia","volume":"44 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"ACM Multimedia Asia","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1145/3469877.3493591","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Hashing compresses heterogeneous high-dimensional data into compact binary codes. Most existing hashing methods first predetermine a fixed hash-code length and then train the model for that length. When task requirements change, however, these methods must retrain the model for the new code length, which increases the time cost. To address this issue, we propose a deep supervised hashing method, called deep multiple length hashing (DMLH), which learns hash codes of multiple lengths simultaneously with a multi-task learning network. DMLH exploits the relationships among the codes through a multi-task network based on hard parameter sharing: the hash codes of different lengths are treated as different views of the same sample. Furthermore, we introduce a mutual information loss to mine the associations among hash codes of different lengths. Extensive experiments indicate that DMLH outperforms most existing models, verifying its effectiveness.
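To make the hard-parameter-sharing idea concrete, below is a minimal PyTorch sketch of a network with one shared backbone and one hash head per target code length, so a single forward pass yields codes of every length at once. The layer sizes, the code lengths (16/32/64), the tanh relaxation, and the sign-based binarization are illustrative assumptions, not the paper's actual design; the mutual information loss described in the abstract is not reproduced here.

```python
import torch
import torch.nn as nn

class MultiLengthHashNet(nn.Module):
    """Hypothetical hard-parameter-sharing multi-task network:
    one shared backbone, one hash head per target code length."""

    def __init__(self, feature_dim: int = 2048, code_lengths=(16, 32, 64)):
        super().__init__()
        # Shared layers: their parameters are reused by every code length
        # (hard parameter sharing).
        self.backbone = nn.Sequential(
            nn.Linear(feature_dim, 1024),
            nn.ReLU(inplace=True),
            nn.Linear(1024, 512),
            nn.ReLU(inplace=True),
        )
        # One task-specific head per code length; tanh relaxes the binary
        # constraint so the codes stay differentiable during training.
        self.heads = nn.ModuleList(
            [nn.Sequential(nn.Linear(512, k), nn.Tanh()) for k in code_lengths]
        )

    def forward(self, x):
        shared = self.backbone(x)
        # Each head maps the same shared representation to a relaxed code,
        # i.e. each code length is a different "view" of the same sample.
        return [head(shared) for head in self.heads]


# Usage: one forward pass produces 16-, 32-, and 64-bit relaxed codes.
net = MultiLengthHashNet()
codes = net(torch.randn(8, 2048))      # shapes (8,16), (8,32), (8,64)
binary = [c.sign() for c in codes]     # binarize for retrieval at inference
```

Because the backbone is shared, adding another code length costs only one extra linear head rather than a full retraining of a separate model, which is the time saving the abstract points to.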