Fast Nearest Neighbor Search with Transformed Residual Quantization
Jiangbo Yuan, Xiuwen Liu
2016 15th IEEE International Conference on Machine Learning and Applications (ICMLA), December 2016. DOI: 10.1109/ICMLA.2016.0175
Product quantization (PQ) and residual quantization (RQ) have been successfully used to solve fast nearest neighbor search problems, thanks to their exponentially reduced storage and computation complexities with respect to the codebook size. Recent efforts have focused on employing optimization strategies and seeking more effective models. Based on the observation that randomness typically increases in successive residual spaces, we propose a new strategy, called transformed RQ (TRQ), that jointly learns a local transformation per residual cluster with the ultimate goal of further reducing the overall quantization error. Additionally, we propose a hybrid approximate nearest neighbor search method based on the proposed TRQ and PQ. We show that our methods achieve significantly better nearest neighbor search accuracy than both the original and the optimized PQ on several benchmark datasets.
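To make the baseline concrete, the following is a minimal sketch of plain residual quantization (RQ), the method TRQ builds on: each stage runs k-means on the residuals left by the previous stage, so a vector is coded by a tuple of centroid indices and reconstructed as the sum of the selected centroids. This is an illustrative toy, not the paper's TRQ (no per-cluster transformations are learned), and all names and parameters here are assumptions.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 16))  # toy database vectors

def rq_train(X, n_stages=2, k=32):
    """Train an RQ codebook: one k-means per residual stage."""
    codebooks, residual = [], X.copy()
    for _ in range(n_stages):
        km = KMeans(n_clusters=k, n_init=4, random_state=0).fit(residual)
        codebooks.append(km.cluster_centers_)
        # the next stage quantizes what this stage could not explain
        residual = residual - km.cluster_centers_[km.labels_]
    return codebooks

def rq_encode(X, codebooks):
    """Greedy encoding: pick the nearest centroid at each stage."""
    codes, residual = [], X.copy()
    for C in codebooks:
        d = ((residual[:, None, :] - C[None, :, :]) ** 2).sum(-1)
        idx = d.argmin(axis=1)
        codes.append(idx)
        residual = residual - C[idx]
    return np.stack(codes, axis=1)  # (n_points, n_stages) integer codes

def rq_decode(codes, codebooks):
    """Reconstruction is the sum of the selected centroids."""
    return sum(C[codes[:, s]] for s, C in enumerate(codebooks))

codebooks = rq_train(X)
codes = rq_encode(X, codebooks)
err_two_stage = np.mean((X - rq_decode(codes, codebooks)) ** 2)
err_one_stage = np.mean((X - rq_decode(codes, codebooks[:1])) ** 2)
assert err_two_stage < err_one_stage  # each residual stage lowers the error
```

The storage cost grows linearly with the number of stages while the effective codebook size grows exponentially (k^n_stages reproduction values from n_stages * k stored centroids), which is the complexity advantage the abstract refers to; TRQ's contribution is to additionally transform each residual cluster so the later-stage codebooks fit better.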