{"title":"无监督图像检索中的去偏交叉对比量化","authors":"Zipeng Chen;Yuan-Gen Wang;Lin-Cheng Li","doi":"10.1109/TBDATA.2024.3453751","DOIUrl":null,"url":null,"abstract":"Contrastive quantization (applying vector quantization to contrastive learning) has achieved great success in large-scale image retrieval because of its advantage of high computational efficiency and small storage space. This article designs a novel optimization framework to simultaneously optimize the cross quantization and the debiased contrastive learning, termed Debiased Cross Contrastive Quantization (DCCQ). The proposed framework is implemented in an end-to-end network, resulting in both reduced quantization error and deletion of many false negative samples. Specifically, to increase the distinguishability between codewords, DCCQ introduces the codeword similarity loss and soft quantization entropy loss for network training. Furthermore, the memory bank strategy and multi-crop image augmentation strategy are employed to promote the effectiveness and efficiency of contrastive learning. Extensive experiments on three large-scale real image benchmark datasets show that the proposed DCCQ yields state-of-the-art results.","PeriodicalId":13106,"journal":{"name":"IEEE Transactions on Big Data","volume":"11 3","pages":"1298-1308"},"PeriodicalIF":7.5000,"publicationDate":"2024-09-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Debiased Cross Contrastive Quantization for Unsupervised Image Retrieval\",\"authors\":\"Zipeng Chen;Yuan-Gen Wang;Lin-Cheng Li\",\"doi\":\"10.1109/TBDATA.2024.3453751\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Contrastive quantization (applying vector quantization to contrastive learning) has achieved great success in large-scale image retrieval because of its advantage of high computational efficiency and small storage space. 
This article designs a novel optimization framework to simultaneously optimize the cross quantization and the debiased contrastive learning, termed Debiased Cross Contrastive Quantization (DCCQ). The proposed framework is implemented in an end-to-end network, resulting in both reduced quantization error and deletion of many false negative samples. Specifically, to increase the distinguishability between codewords, DCCQ introduces the codeword similarity loss and soft quantization entropy loss for network training. Furthermore, the memory bank strategy and multi-crop image augmentation strategy are employed to promote the effectiveness and efficiency of contrastive learning. Extensive experiments on three large-scale real image benchmark datasets show that the proposed DCCQ yields state-of-the-art results.\",\"PeriodicalId\":13106,\"journal\":{\"name\":\"IEEE Transactions on Big Data\",\"volume\":\"11 3\",\"pages\":\"1298-1308\"},\"PeriodicalIF\":7.5000,\"publicationDate\":\"2024-09-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"IEEE Transactions on Big Data\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://ieeexplore.ieee.org/document/10663968/\",\"RegionNum\":3,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, INFORMATION SYSTEMS\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"IEEE Transactions on Big Data","FirstCategoryId":"94","ListUrlMain":"https://ieeexplore.ieee.org/document/10663968/","RegionNum":3,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INFORMATION 
SYSTEMS","Score":null,"Total":0}
Debiased Cross Contrastive Quantization for Unsupervised Image Retrieval
Contrastive quantization (applying vector quantization to contrastive learning) has achieved great success in large-scale image retrieval owing to its high computational efficiency and small storage footprint. This article proposes a novel optimization framework, termed Debiased Cross Contrastive Quantization (DCCQ), that jointly optimizes cross quantization and debiased contrastive learning. The framework is implemented as an end-to-end network, which both reduces quantization error and removes many false negative samples. Specifically, to increase the distinguishability between codewords, DCCQ introduces a codeword similarity loss and a soft quantization entropy loss for network training. Furthermore, a memory bank strategy and a multi-crop image augmentation strategy are employed to improve the effectiveness and efficiency of contrastive learning. Extensive experiments on three large-scale real-image benchmark datasets show that the proposed DCCQ yields state-of-the-art results.
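The abstract names two core ingredients: soft quantization of embeddings against a learned codebook (with an entropy term over the soft assignments) and a debiased contrastive loss that corrects for false negatives. The sketch below illustrates generic forms of both ideas in NumPy; the temperatures `tau` and `t`, the false-negative prior `tau_plus`, and the exact loss shapes are assumptions, with the debiased estimator following the general form popularized by Chuang et al. (2020) rather than necessarily DCCQ's own formulation.

```python
import numpy as np

def soft_quantize(z, codebook, tau=1.0):
    """Soft product quantization (illustrative sketch, not the paper's code).

    z:        (M, D) array of M subvectors of one embedding
    codebook: (M, K, D) array with K codewords per subspace
    Returns the soft-quantized reconstruction and the mean assignment entropy
    (a candidate "soft quantization entropy" regularizer)."""
    d2 = ((z[:, None, :] - codebook) ** 2).sum(-1)      # (M, K) squared distances
    logits = -tau * d2
    logits -= logits.max(axis=1, keepdims=True)         # numerically stable softmax
    p = np.exp(logits)
    p /= p.sum(axis=1, keepdims=True)                   # (M, K) soft assignments
    z_hat = (p[:, :, None] * codebook).sum(axis=1)      # (M, D) codeword mixture
    entropy = -(p * np.log(p + 1e-12)).sum(axis=1).mean()
    return z_hat, entropy

def debiased_contrastive_loss(sim_pos, sim_negs, t=0.5, tau_plus=0.1):
    """Debiased InfoNCE in the style of Chuang et al. (2020): reweights the
    negative term to discount likely false negatives. All hyperparameter
    values here are illustrative assumptions."""
    pos = np.exp(sim_pos / t)
    negs = np.exp(sim_negs / t)
    n = len(sim_negs)
    # Corrected negative mass, clipped at its theoretical minimum e^{-1/t}.
    g = np.maximum((negs.mean() - tau_plus * pos) / (1.0 - tau_plus),
                   np.exp(-1.0 / t))
    return -np.log(pos / (pos + n * g))
```

In an end-to-end pipeline of this kind, the quantized reconstruction `z_hat` would feed the cross contrastive objective while the entropy term and a codeword similarity penalty regularize the codebook; the debiasing correction is what deletes the contribution of probable false negatives from the loss.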
Journal description:
The IEEE Transactions on Big Data publishes peer-reviewed articles focusing on big data. These articles present innovative research ideas and application results across disciplines, including novel theories, algorithms, and applications. Research areas cover a wide range, such as big data analytics, visualization, curation, management, semantics, infrastructure, standards, performance analysis, intelligence extraction, scientific discovery, security, privacy, and legal issues specific to big data. The journal also prioritizes applications of big data in fields generating massive datasets.