The Fast $k-\text{NN}$ Algorithm Based on a Fixed Grid Method

G. Jan, Kuan-Lin Su, Hui-Ching Hsieh, C. Luo
{"title":"基于固定网格方法的快速$k-\\text{NN}$算法","authors":"G. Jan, Kuan-Lin Su, Hui-Ching Hsieh, C. Luo","doi":"10.1109/airc56195.2022.9836452","DOIUrl":null,"url":null,"abstract":"$k-\\text{Nearest}$ Neighbor $(k-\\text{NN})$ is a well-known instance-based learning algorithm; widely used in pattern recognition. A $k-\\text{NN}$ classifier can generate highly accurate predictions if provided with sufficient training instances. Thus, it plays a pivotal role in many fields. However, though its accuracy can improve with more data, the need for computational resources increases as well. In this paper, we propose a novel approach which pre-partitions instance space into smaller cells in order to reduce computational cost and greatly reduce the time complexity. Assume every instance is mapped to a point in the $d-\\text{dimensional}$ Euclidean space and a training set $D$ contains $n$ training instances. Given a query instance, the brute force $k-\\text{NN}$ algorithm has the $O(nd)$ time complexity for predicting the class of a query instance. This algorithm can improve the time complexity.","PeriodicalId":147463,"journal":{"name":"2022 3rd International Conference on Artificial Intelligence, Robotics and Control (AIRC)","volume":"457 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-05-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"The Fast $k-\\\\text{NN}$ Algorithm Based on a Fixed Grid Method\",\"authors\":\"G. Jan, Kuan-Lin Su, Hui-Ching Hsieh, C. Luo\",\"doi\":\"10.1109/airc56195.2022.9836452\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"$k-\\\\text{Nearest}$ Neighbor $(k-\\\\text{NN})$ is a well-known instance-based learning algorithm; widely used in pattern recognition. A $k-\\\\text{NN}$ classifier can generate highly accurate predictions if provided with sufficient training instances. Thus, it plays a pivotal role in many fields. However, though its accuracy can improve with more data, the need for computational resources increases as well. In this paper, we propose a novel approach which pre-partitions instance space into smaller cells in order to reduce computational cost and greatly reduce the time complexity. Assume every instance is mapped to a point in the $d-\\\\text{dimensional}$ Euclidean space and a training set $D$ contains $n$ training instances. Given a query instance, the brute force $k-\\\\text{NN}$ algorithm has the $O(nd)$ time complexity for predicting the class of a query instance. 
This algorithm can improve the time complexity.\",\"PeriodicalId\":147463,\"journal\":{\"name\":\"2022 3rd International Conference on Artificial Intelligence, Robotics and Control (AIRC)\",\"volume\":\"457 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-05-10\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 3rd International Conference on Artificial Intelligence, Robotics and Control (AIRC)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/airc56195.2022.9836452\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 3rd International Conference on Artificial Intelligence, Robotics and Control (AIRC)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/airc56195.2022.9836452","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

$k-\text{Nearest}$ Neighbor $(k-\text{NN})$ is a well-known instance-based learning algorithm that is widely used in pattern recognition. A $k-\text{NN}$ classifier can generate highly accurate predictions if provided with sufficient training instances, and it therefore plays a pivotal role in many fields. However, although its accuracy can improve with more data, the need for computational resources increases as well. In this paper, we propose a novel approach that pre-partitions the instance space into smaller cells in order to reduce the computational cost and greatly reduce the time complexity. Assume every instance is mapped to a point in the $d-\text{dimensional}$ Euclidean space and a training set $D$ contains $n$ training instances. The brute-force $k-\text{NN}$ algorithm then has $O(nd)$ time complexity for predicting the class of a query instance. The proposed algorithm improves on this time complexity.
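
The full text is not reproduced here, so the following is only a minimal, hypothetical sketch of the general idea the abstract describes: bucket the training points into a fixed uniform grid, then search outward from the query's cell ring by ring until the $k$ nearest neighbors are guaranteed to have been found. The class name GridKNN, the cell_size parameter, and the expanding-ring stopping rule are illustrative assumptions, not the authors' implementation.

```python
import math
from collections import Counter, defaultdict
from itertools import product

class GridKNN:
    def __init__(self, points, labels, cell_size):
        self.points = points                  # list of d-dimensional tuples
        self.labels = labels
        self.s = cell_size
        self.grid = defaultdict(list)         # cell index -> list of point indices
        for i, p in enumerate(points):
            self.grid[self._cell(p)].append(i)

    def _cell(self, p):
        # Index of the fixed grid cell that contains point p.
        return tuple(math.floor(x / self.s) for x in p)

    def query(self, q, k):
        center = self._cell(q)
        radius = 0
        while True:
            # Collect candidates from every cell within Chebyshev index
            # distance `radius` of the query's cell (a (2*radius+1)^d box).
            box = [range(c - radius, c + radius + 1) for c in center]
            cand = [i for cell in product(*box) for i in self.grid.get(cell, ())]
            dists = sorted((math.dist(q, self.points[i]), i) for i in cand)
            # Every point outside the box is at least radius * cell_size away,
            # so once the k-th candidate is closer than that, the k nearest
            # neighbors inside the box are exact.
            if len(dists) >= k and dists[k - 1][0] <= radius * self.s:
                break
            if len(cand) >= len(self.points):  # whole grid visited
                break
            radius += 1
        # Majority vote over the labels of the k nearest candidates.
        votes = Counter(self.labels[i] for _, i in dists[:k])
        return votes.most_common(1)[0][0]

# Toy usage: two small clusters with labels "a" and "b".
pts = [(0.1, 0.2), (0.9, 0.8), (0.15, 0.25), (0.8, 0.9)]
clf = GridKNN(pts, labels=["a", "b", "a", "b"], cell_size=0.5)
print(clf.query((0.2, 0.2), k=3))  # -> "a"
```

With a well-chosen cell size, a query inspects only a handful of cells rather than all $n$ training points, which is where the speed-up claimed in the abstract would come from.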