Title: Self-Supervised Locality-Sensitive Deep Hashing for the Robust Retrieval of Degraded Images
Authors: Lingyun Xiang; Hailang Hu; Qian Li; Hao Yu; Xiaobo Shen
Journal: IEEE Transactions on Information Forensics and Security, vol. 20, pp. 1582-1596 (JCR Q1, Computer Science, Theory & Methods)
DOI: 10.1109/TIFS.2025.3531104
Published: 2025-01-23
URL: https://ieeexplore.ieee.org/document/10851322/
Citations: 0
Abstract
In recent years, vast numbers of degraded images have flooded search engines and social networks, and they see extensive practical use in the real world. However, these images also pose new challenges to conventional image retrieval. To this end, we introduce a new task of retrieving degraded images from large-scale databases through deep hashing, and present the Locality-Sensitive Hashing Network (LSHNet) to tackle it in a self-supervised manner. Specifically, we first propose a triplet strategy that enables self-supervised, end-to-end training of LSHNet. Owing to this strategy, the high semantic similarity and discriminability of degraded images are well preserved in the learned latent codes, without the additional human labor of labeling vast quantities of degraded images. Moreover, to handle large-scale image retrieval efficiently, we further transform the latent codes into locality-sensitive hashing codes, so that degraded images can be retrieved in sublinear time with almost no loss of representation ability. Extensive experiments on three public benchmarks demonstrate the superior performance of LSHNet in retrieving similar images under degraded conditions.
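To make the retrieval step concrete, the following is a minimal sketch of the general locality-sensitive hashing idea the abstract invokes: real-valued latent codes are binarized with random hyperplane projections (sign-of-dot-product LSH), so perturbed latents of similar images map to nearby binary codes and can be ranked by Hamming distance. This is a generic illustration, not the paper's actual network or hashing scheme; the latent dimensionality, code length, and toy database are assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)

latent_dim, code_bits = 128, 32
# Random hyperplanes, shared between indexing and querying.
hyperplanes = rng.standard_normal((latent_dim, code_bits))

def lsh_code(latent):
    """Binarize a latent vector: bit b = 1 iff latent . hyperplane_b > 0."""
    return (latent @ hyperplanes > 0).astype(np.uint8)

# Toy database of latent codes (stand-ins for the hashing network's outputs).
db_latents = rng.standard_normal((1000, latent_dim))
db_codes = (db_latents @ hyperplanes > 0).astype(np.uint8)

def retrieve(query_latent, top_k=5):
    """Rank database items by Hamming distance to the query's LSH code."""
    q = lsh_code(query_latent)
    dists = (db_codes != q).sum(axis=1)  # Hamming distance to each item
    return np.argsort(dists)[:top_k]

# A query derived from database item 0 by a small perturbation (mimicking
# degradation in latent space) should still rank item 0 highly, because a
# small latent change flips few sign bits.
query = db_latents[0] + 0.05 * rng.standard_normal(latent_dim)
print(retrieve(query))
```

In a real system the binary codes would also be bucketed (e.g., by multi-table LSH lookup) rather than linearly scanned, which is what yields the sublinear query time claimed in the abstract; the ranking logic per bucket is the same as above.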
Journal Description
The IEEE Transactions on Information Forensics and Security covers the sciences, technologies, and applications relating to information forensics, information security, biometrics, and surveillance, as well as systems applications that incorporate these features.