SmartSketcher: sketch-based image retrieval with dynamic semantic re-ranking

Tiziano Portenier, Qiyang Hu, P. Favaro, Matthias Zwicker

Proceedings of the Symposium on Sketch-Based Interfaces and Modeling, 2017. DOI: 10.1145/3092907.3092910
Citations: 5
Abstract
We present a sketch-based image retrieval system designed to answer arbitrary queries that may go beyond searching for predefined object or scene categories. While sketching is a fast and intuitive way to formulate visual queries, pure sketch-based image retrieval often returns many outliers because it lacks a semantic understanding of the query. Our key idea is to combine sketch-based queries with interactive, semantic re-ranking of query results. We leverage progress in deep learning and use a feature representation learned for image classification for re-ranking. This allows us to cluster semantically similar images, re-rank based on the clusters, and present more meaningful query results to the user. We report on two large-scale benchmarks and demonstrate that our re-ranking approach leads to significant improvements over the state of the art. Finally, a user study designed to evaluate a practical use case confirms the benefits of our approach.
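To make the re-ranking idea concrete, the following is a minimal, hypothetical sketch of cluster-based re-ranking in a semantic feature space. It is not the paper's implementation: the function name `rerank_by_clusters`, the use of k-means, the cluster count, and the promotion rule are illustrative assumptions; the abstract only states that images are clustered using classification-trained deep features and re-ranked based on those clusters.

```python
# Illustrative sketch of cluster-based semantic re-ranking (assumptions, not
# the authors' implementation): deep features are precomputed, k-means forms
# the semantic clusters, and results sharing a cluster with a user-selected
# image are promoted while the original sketch-based order is otherwise kept.

import numpy as np
from sklearn.cluster import KMeans


def rerank_by_clusters(result_ids, features, selected_ids, n_clusters=10):
    """Re-rank sketch-query results using semantic feature clusters.

    result_ids   : image ids ordered by the initial sketch-based score
    features     : (n_images, d) array of deep features (e.g. CNN classifier
                   activations), one row per entry in result_ids
    selected_ids : image ids the user marked as relevant
    """
    # Cluster the retrieved images in the semantic feature space.
    kmeans = KMeans(n_clusters=n_clusters, n_init=10, random_state=0)
    labels = kmeans.fit_predict(np.asarray(features))

    id_to_idx = {img_id: i for i, img_id in enumerate(result_ids)}
    selected_clusters = {
        labels[id_to_idx[s]] for s in selected_ids if s in id_to_idx
    }

    # Promote images whose cluster contains a user-selected image; ties keep
    # the original sketch-based rank order.
    keyed = [
        (0 if labels[id_to_idx[img_id]] in selected_clusters else 1, rank, img_id)
        for rank, img_id in enumerate(result_ids)
    ]
    return [img_id for _, _, img_id in sorted(keyed)]
```

As a usage note, one could pass the top-k results of a sketch matcher together with their CNN features and whatever images the user has clicked as relevant; the returned list is the re-ranked order shown back to the user.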