Improving the Performance of kNN in the MapReduce Framework Using Locality Sensitive Hashing

S. Bagui, A. Mondal, S. Bagui
{"title":"Improving the Performance of kNN in the MapReduce Framework Using Locality Sensitive Hashing","authors":"S. Bagui, A. Mondal, S. Bagui","doi":"10.4018/ijdst.2019100101","DOIUrl":null,"url":null,"abstract":"In this work the authors present a parallel k nearest neighbor (kNN) algorithm using locality sensitive hashing to preprocess the data before it is classified using kNN in Hadoop's MapReduce framework. This is compared with the sequential (conventional) implementation. Using locality sensitive hashing's similarity measure with kNN, the iterative procedure to classify a data object is performed within a hash bucket rather than the whole data set, greatly reducing the computation time needed for classification. Several experiments were run that showed that the parallel implementation performed better than the sequential implementation on very large datasets. The study also experimented with a few map and reduce side optimization features for the parallel implementation and presented some optimum map and reduce side parameters. Among the map side parameters, the block size and input split size were varied, and among the reduce side parameters, the number of planes were varied, and their effects were studied.","PeriodicalId":118536,"journal":{"name":"Int. J. Distributed Syst. Technol.","volume":"142 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-10-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Int. J. Distributed Syst. Technol.","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4018/ijdst.2019100101","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 3

Abstract

In this work, the authors present a parallel k-nearest neighbor (kNN) algorithm that uses locality-sensitive hashing (LSH) to preprocess the data before it is classified with kNN in Hadoop's MapReduce framework. This parallel implementation is compared with the sequential (conventional) implementation. By combining LSH's similarity measure with kNN, the iterative procedure that classifies a data object is performed within a single hash bucket rather than over the whole dataset, greatly reducing the computation time needed for classification. Several experiments showed that the parallel implementation outperformed the sequential implementation on very large datasets. The study also experimented with several map-side and reduce-side optimization features for the parallel implementation and reports the resulting optimal map-side and reduce-side parameters. Among the map-side parameters, the block size and input split size were varied; among the reduce-side parameters, the number of planes was varied; and their effects were studied.
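The core idea behind the speedup can be sketched outside of MapReduce: random hyperplanes give each data point a short bit signature, points that share a signature fall into the same hash bucket, and kNN then votes only among the query's bucket members instead of the whole dataset. The Python sketch below is a minimal illustration of that idea, not the authors' Hadoop implementation; the function names (lsh_signature, build_buckets, classify), the fallback when a bucket is empty, and the toy data are assumptions introduced here for illustration.

# Minimal sketch: random-hyperplane LSH bucketing followed by kNN within a bucket.
# Names and the empty-bucket fallback are illustrative assumptions, not from the paper.
import random
from collections import Counter, defaultdict

def lsh_signature(point, planes):
    # One bit per hyperplane: the sign of the dot product with the plane's normal.
    bits = []
    for plane in planes:
        dot = sum(p * w for p, w in zip(point, plane))
        bits.append('1' if dot >= 0 else '0')
    return ''.join(bits)

def build_buckets(data, labels, planes):
    # Preprocessing step: group the training set into hash buckets by signature.
    buckets = defaultdict(list)
    for point, label in zip(data, labels):
        buckets[lsh_signature(point, planes)].append((point, label))
    return buckets

def classify(query, buckets, planes, k=3):
    # kNN restricted to the query's bucket instead of the whole dataset.
    candidates = buckets.get(lsh_signature(query, planes), [])
    if not candidates:  # illustrative fallback: scan everything if the bucket is empty
        candidates = [pair for bucket in buckets.values() for pair in bucket]
    candidates.sort(key=lambda pl: sum((q - x) ** 2 for q, x in zip(query, pl[0])))
    top_labels = [label for _, label in candidates[:k]]
    return Counter(top_labels).most_common(1)[0][0]

if __name__ == '__main__':
    random.seed(0)
    dim, num_planes = 2, 4  # "number of planes" is the reduce-side parameter varied in the study
    planes = [[random.gauss(0, 1) for _ in range(dim)] for _ in range(num_planes)]
    train = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
    labels = ['A', 'A', 'B', 'B']
    buckets = build_buckets(train, labels, planes)
    print(classify((0.15, 0.15), buckets, planes, k=3))  # expected: 'A'

In a MapReduce setting, one natural way to parallelize this (an assumption here, offered only to connect the sketch to the framework) is to have mappers emit the bucket signature as the intermediate key, so that each reducer performs the kNN vote over a single bucket.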