Towards Highly-Efficient k-Nearest Neighbor Algorithm for Big Data Classification
Authors: H. I. Abdalla, A. Amer
DOI: 10.1109/NISS55057.2022.10085013
Published: 2022-03-30, in 2022 5th International Conference on Networking, Information Systems and Security: Envisage Intelligent Systems in 5G/6G-based Interconnected Digital Worlds (NISS)
Citations: 0
Abstract
The k-nearest neighbors (kNN) algorithm searches for the nearest neighbors of a test point in a feature space. Many works in the literature aim to accelerate data classification with kNN. In line with these efforts, we present a novel kNN variant with a neighboring-calculation property, called NCP-kNN. NCP-kNN addresses both the search complexity of kNN and the problem of high-dimensional classification; together, these two problems cause complexity to increase exponentially, particularly with big datasets and multiple k values. In NCP-kNN, each test point's distance is computed against only a limited number of training points rather than the entire dataset. Experimental results on six small datasets show that NCP-kNN matches the classification performance of standard kNN while being highly efficient. Furthermore, results on big datasets demonstrate that NCP-kNN is not only faster than standard kNN but also significantly more accurate. On the whole, the findings show that NCP-kNN is a promising, highly efficient kNN variant for big data classification.
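The core idea stated in the abstract, comparing each test point against only a limited subset of training points rather than the full dataset, can be sketched as follows. This is a minimal illustration of restricted-candidate kNN, not the authors' method: the paper's actual rule for choosing the candidate subset is not given in the abstract, so the `candidates` parameter here is a hypothetical placeholder.

```python
import math
from collections import Counter

def knn_predict(train, x, k=3, candidates=None):
    """Classify x by majority vote among its k nearest training points.

    train: list of (point, label) pairs.
    candidates: optional list of training indices to scan. When given,
    distances are computed only against that subset -- the general idea
    behind NCP-kNN's reduced search (the selection rule itself is an
    assumption here, not taken from the paper).
    """
    idx = list(candidates) if candidates is not None else range(len(train))
    # Sort candidate indices by Euclidean distance to the test point.
    nearest = sorted(idx, key=lambda i: math.dist(train[i][0], x))[:k]
    votes = Counter(train[i][1] for i in nearest)
    return votes.most_common(1)[0][0]

# Toy data: two well-separated clusters.
train = [((0.0, 0.0), "a"), ((0.1, 0.1), "a"), ((0.2, 0.0), "a"),
         ((5.0, 5.0), "b"), ((5.1, 4.9), "b"), ((4.9, 5.2), "b")]

full = knn_predict(train, (0.05, 0.05), k=3)                      # scans all 6 points
limited = knn_predict(train, (0.05, 0.05), k=3, candidates=[0, 1, 2])  # scans only 3
print(full, limited)  # both "a"
```

With a sensible candidate subset the restricted search returns the same label while computing far fewer distances, which is the efficiency gain the abstract claims for big datasets.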