Debiased Cross Contrastive Quantization for Unsupervised Image Retrieval

IF 7.5 · CAS Tier 3 (Computer Science) · JCR Q1 (Computer Science, Information Systems)
Zipeng Chen;Yuan-Gen Wang;Lin-Cheng Li
Citations: 0

Abstract

Debiased Cross Contrastive Quantization for Unsupervised Image Retrieval
Contrastive quantization (applying vector quantization to contrastive learning) has achieved great success in large-scale image retrieval owing to its high computational efficiency and small storage footprint. This article designs a novel optimization framework, termed Debiased Cross Contrastive Quantization (DCCQ), that jointly optimizes cross quantization and debiased contrastive learning. The framework is implemented as an end-to-end network, reducing quantization error while removing many false-negative samples. Specifically, to increase the distinguishability between codewords, DCCQ introduces a codeword similarity loss and a soft quantization entropy loss for network training. Furthermore, a memory-bank strategy and a multi-crop image augmentation strategy are employed to improve the effectiveness and efficiency of contrastive learning. Extensive experiments on three large-scale real-image benchmark datasets show that DCCQ yields state-of-the-art results.
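The abstract names three components — soft quantization, a codeword similarity loss, and debiased contrastive learning that suppresses false negatives — without giving their formulas. As a rough illustration only (all shapes and function names below are assumptions, and the debiased term follows the Chuang et al. 2020 estimator, which may differ from DCCQ's actual loss), a minimal NumPy sketch of these ideas could look like:

```python
import numpy as np

def soft_quantize(z, codebook, tau=1.0):
    """Soft vector quantization: a softmax-weighted sum of codewords,
    which keeps the quantizer differentiable for end-to-end training.
    z: (d,) feature; codebook: (K, d) codewords (hypothetical shapes)."""
    logits = codebook @ z / tau            # similarity of z to each codeword
    w = np.exp(logits - logits.max())
    w /= w.sum()                           # soft assignment weights
    return w @ codebook, w                 # soft-quantized feature, weights

def codeword_similarity_loss(codebook):
    """One plausible form of a 'codeword similarity loss': the mean
    off-diagonal cosine similarity between codewords, which is small
    when codewords are mutually distinguishable."""
    C = codebook / np.linalg.norm(codebook, axis=1, keepdims=True)
    S = C @ C.T
    K = len(codebook)
    return (S.sum() - np.trace(S)) / (K * (K - 1))

def debiased_infonce(pos, negs, t=0.5, tau_plus=0.1):
    """Debiased contrastive loss in the Chuang et al. (2020) form: the
    negative term is reweighted to correct for false negatives, clamped
    at its theoretical minimum N * e^{-1/t}."""
    N = len(negs)
    pos_e = np.exp(pos / t)
    neg_e = np.exp(np.asarray(negs) / t)
    Ng = np.maximum((neg_e.mean() - tau_plus * pos_e) * N / (1 - tau_plus),
                    N * np.exp(-1 / t))
    return -np.log(pos_e / (pos_e + Ng))

# toy check: with an orthonormal codebook and a low temperature,
# the soft assignment concentrates on the matching codeword
cb = np.eye(4)
zq, w = soft_quantize(np.array([1.0, 0.0, 0.0, 0.0]), cb, tau=0.1)
```

This sketch only mirrors the component names from the abstract; the paper's actual architecture, entropy loss, and memory-bank sampling are not reproduced here.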
Source journal: IEEE Transactions on Big Data
CiteScore: 11.80
Self-citation rate: 2.80%
Articles per year: 114
Journal overview: The IEEE Transactions on Big Data publishes peer-reviewed articles focusing on big data. These articles present innovative research ideas and application results across disciplines, including novel theories, algorithms, and applications. Research areas cover a wide range, such as big data analytics, visualization, curation, management, semantics, infrastructure, standards, performance analysis, intelligence extraction, scientific discovery, security, privacy, and legal issues specific to big data. The journal also prioritizes applications of big data in fields generating massive datasets.