Operational Advice for Dense and Sparse Retrievers: HNSW, Flat, or Inverted Indexes?

Jimmy Lin
{"title":"Operational Advice for Dense and Sparse Retrievers: HNSW, Flat, or Inverted Indexes?","authors":"Jimmy Lin","doi":"arxiv-2409.06464","DOIUrl":null,"url":null,"abstract":"Practitioners working on dense retrieval today face a bewildering number of\nchoices. Beyond selecting the embedding model, another consequential choice is\nthe actual implementation of nearest-neighbor vector search. While best\npractices recommend HNSW indexes, flat vector indexes with brute-force search\nrepresent another viable option, particularly for smaller corpora and for rapid\nprototyping. In this paper, we provide experimental results on the BEIR dataset\nusing the open-source Lucene search library that explicate the tradeoffs\nbetween HNSW and flat indexes (including quantized variants) from the\nperspectives of indexing time, query evaluation performance, and retrieval\nquality. With additional comparisons between dense and sparse retrievers, our\nresults provide guidance for today's search practitioner in understanding the\ndesign space of dense and sparse retrievers. To our knowledge, we are the first\nto provide operational advice supported by empirical experiments in this\nregard.","PeriodicalId":501281,"journal":{"name":"arXiv - CS - Information Retrieval","volume":"14 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Information Retrieval","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06464","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}

Abstract

Practitioners working on dense retrieval today face a bewildering number of choices. Beyond selecting the embedding model, another consequential choice is the actual implementation of nearest-neighbor vector search. While best practices recommend HNSW indexes, flat vector indexes with brute-force search represent another viable option, particularly for smaller corpora and for rapid prototyping. In this paper, we provide experimental results on the BEIR dataset using the open-source Lucene search library that explicate the tradeoffs between HNSW and flat indexes (including quantized variants) from the perspectives of indexing time, query evaluation performance, and retrieval quality. With additional comparisons between dense and sparse retrievers, our results provide guidance for today's search practitioner in understanding the design space of dense and sparse retrievers. To our knowledge, we are the first to provide operational advice supported by empirical experiments in this regard.
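To make the design choice concrete, below is a minimal, hypothetical sketch of dense vector indexing and approximate nearest-neighbor search with Lucene's KNN vector APIs (assuming Lucene 9.x); it is not the Anserini/Pyserini configuration used in the paper's experiments. The field name, index path, and toy unit-length vectors are illustrative assumptions. Lucene builds an HNSW graph for KNN vector fields by default; a "flat" setup instead stores the raw vectors and scores every document exhaustively, which is exact but scales linearly with corpus size.

```java
// Minimal sketch of HNSW-backed dense retrieval with Lucene's KNN APIs (Lucene 9.x).
// Illustrative only: field names, paths, and vectors below are hypothetical.

import java.nio.file.Paths;

import org.apache.lucene.document.Document;
import org.apache.lucene.document.KnnFloatVectorField;
import org.apache.lucene.document.StoredField;
import org.apache.lucene.index.DirectoryReader;
import org.apache.lucene.index.IndexWriter;
import org.apache.lucene.index.IndexWriterConfig;
import org.apache.lucene.index.VectorSimilarityFunction;
import org.apache.lucene.search.IndexSearcher;
import org.apache.lucene.search.KnnFloatVectorQuery;
import org.apache.lucene.search.ScoreDoc;
import org.apache.lucene.search.TopDocs;
import org.apache.lucene.store.Directory;
import org.apache.lucene.store.FSDirectory;

public class KnnSketch {
  public static void main(String[] args) throws Exception {
    Directory dir = FSDirectory.open(Paths.get("hnsw-index"));  // hypothetical index path

    // Indexing: each document carries an embedding produced by some encoder.
    // Lucene builds an HNSW graph over KNN vector fields at index time, which is
    // where the indexing-time cost discussed in the paper comes from.
    try (IndexWriter writer = new IndexWriter(dir, new IndexWriterConfig())) {
      float[] embedding = {0.5f, 0.5f, 0.5f, 0.5f};  // toy unit-length 4-dim vector; real models use hundreds of dims
      Document doc = new Document();
      doc.add(new StoredField("docid", "doc-0"));
      doc.add(new KnnFloatVectorField("vector", embedding, VectorSimilarityFunction.DOT_PRODUCT));
      writer.addDocument(doc);
    }

    // Search: approximate top-k retrieval over the HNSW graph. A flat index
    // would instead compare the query vector against every stored vector.
    try (DirectoryReader reader = DirectoryReader.open(dir)) {
      IndexSearcher searcher = new IndexSearcher(reader);
      float[] queryEmbedding = {0.5f, 0.5f, 0.5f, 0.5f};  // toy query vector
      TopDocs hits = searcher.search(new KnnFloatVectorQuery("vector", queryEmbedding, 10), 10);
      for (ScoreDoc hit : hits.scoreDocs) {
        System.out.println(hit.doc + " " + hit.score);
      }
    }
  }
}
```

The tradeoff the paper quantifies sits between these two paths: HNSW pays an up-front graph-construction cost and returns approximate results quickly, while a flat index is cheap to build and exact but slower to query as the corpus grows, with quantized variants of both reducing memory at some cost in retrieval quality.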