{"title":"On the complexity of probabilistic image retrieval","authors":"N. Vasconcelos","doi":"10.1109/ICCV.2001.937653","DOIUrl":null,"url":null,"abstract":"Probabilistic image retrieval approaches can lead to significant gains over standard retrieval techniques. However, this occurs at the cost of a significant increase in computational complexity. In fact, closed-form solutions for probabilistic retrieval are currently available only for simple representations such as the Gaussian and the histogram. We analyze the case of mixture densities and exploit the asymptotic equivalence between likelihood and Kullback-Leibler divergence to derive solutions for these models. In particular, (1) we show that the divergence can be computed exactly for vector quantizers and, (2) has an approximate solution for Gaussian mixtures that introduces no significant degradation of the resulting similarity judgments. In both cases, the new solutions have closed-form and computational complexity equivalent to that of standard retrieval approaches, but significantly better retrieval performance.","PeriodicalId":429441,"journal":{"name":"Proceedings Eighth IEEE International Conference on Computer Vision. ICCV 2001","volume":"13 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2001-07-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"60","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings Eighth IEEE International Conference on Computer Vision. ICCV 2001","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICCV.2001.937653","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 60
Abstract
Probabilistic image retrieval approaches can lead to significant gains over standard retrieval techniques. However, this occurs at the cost of a significant increase in computational complexity. In fact, closed-form solutions for probabilistic retrieval are currently available only for simple representations such as the Gaussian and the histogram. We analyze the case of mixture densities and exploit the asymptotic equivalence between likelihood and Kullback-Leibler divergence to derive solutions for these models. In particular, we show that (1) the divergence can be computed exactly for vector quantizers, and (2) it admits an approximate solution for Gaussian mixtures that introduces no significant degradation of the resulting similarity judgments. In both cases, the new solutions are closed-form and have computational complexity equivalent to that of standard retrieval approaches, but significantly better retrieval performance.
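The abstract does not spell out the actual formulas, so the sketch below only illustrates the general idea in Python: the exact discrete KL divergence for vector-quantizer/histogram models, the closed-form KL between two Gaussians, and a component-matching approximation for Gaussian mixtures. The matching formula shown is a well-known closed-form approximation from the later literature, not necessarily the paper's asymptotic-likelihood derivation, and all function names and the toy data are illustrative assumptions.

```python
import numpy as np

def kl_histogram(p, q, eps=1e-12):
    # Exact KL divergence between two discrete distributions, e.g. the
    # cell probabilities of two vector quantizers / histograms.
    p = np.asarray(p, float) + eps
    q = np.asarray(q, float) + eps
    p, q = p / p.sum(), q / q.sum()
    return float(np.sum(p * np.log(p / q)))

def kl_gaussian(m0, S0, m1, S1):
    # Closed-form KL( N(m0,S0) || N(m1,S1) ) between multivariate Gaussians.
    d = m0.shape[0]
    S1_inv = np.linalg.inv(S1)
    diff = m1 - m0
    _, logdet0 = np.linalg.slogdet(S0)
    _, logdet1 = np.linalg.slogdet(S1)
    return 0.5 * (np.trace(S1_inv @ S0) + diff @ S1_inv @ diff
                  - d + logdet1 - logdet0)

def kl_gmm_matched(w_p, mu_p, S_p, w_q, mu_q, S_q):
    # Approximate KL between two Gaussian mixtures: match each component of
    # the first mixture to its best (lowest-cost) component of the second.
    # This is a common closed-form approximation, NOT the paper's exact one.
    total = 0.0
    for wi, mi, Si in zip(w_p, mu_p, S_p):
        best = min(kl_gaussian(mi, Si, mj, Sj) + np.log(wi / wj)
                   for wj, mj, Sj in zip(w_q, mu_q, S_q))
        total += wi * best
    return total

if __name__ == "__main__":
    # Two toy 2-component mixtures in 2-D (hypothetical data).
    mu_p = [np.zeros(2), np.ones(2)]
    mu_q = [np.array([0.1, -0.1]), np.array([1.2, 0.9])]
    S = [np.eye(2), 0.5 * np.eye(2)]
    w = [0.6, 0.4]
    print("histogram KL:", kl_histogram([0.2, 0.5, 0.3], [0.3, 0.3, 0.4]))
    print("GMM KL (approx):", kl_gmm_matched(w, mu_p, S, w, mu_q, S))
```

Under these assumptions, comparing two mixtures with K_p and K_q components costs only K_p x K_q closed-form Gaussian KL evaluations, which is consistent with the abstract's claim that the complexity stays comparable to standard (e.g., histogram-based) retrieval.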