{"title":"带有自适应近邻的贝叶斯决定线性 KNN","authors":"Jin Zhang, Zekang Bian, Shitong Wang","doi":"10.1155/2024/6664942","DOIUrl":null,"url":null,"abstract":"<p>While the classical KNN (<i>k</i> nearest neighbor) shares its avoidance of the consistent distribution assumption between training and testing samples to achieve fast prediction, it still faces two challenges: (a) its generalization ability heavily depends on an appropriate number <i>k</i> of nearest neighbors; (b) its prediction behavior lacks interpretability. In order to address the two challenges, a novel Bayes-decisive linear KNN with adaptive nearest neighbors (<i>i.e</i>., BLA-KNN) is proposed to obtain the following three merits: (a) a diagonal matrix is introduced to adaptively select the nearest neighbors and simultaneously improve the generalization capability of the proposed BLA-KNN method; (b) the proposed BLA-KNN method owns the group effect, which inherits and extends the group property of the sum of squares for total deviations by reflecting the training sample class-aware information in the group effect regularization term; (c) the prediction behavior of the proposed BLA-KNN method can be interpreted from the Bayes-decision-rule perspective. In order to do so, we first use a diagonal matrix to weigh each training sample so as to obtain the importance of the sample, while constraining the importance weights to ensure that the adaptive <i>k</i> value is carried out efficiently. Second, we introduce a class-aware information regularization term in the objective function to obtain the nearest neighbor group effect of the samples. Finally, we introduce linear expression weights related to the distance measure between the testing and training samples in the regularization term to ensure that the interpretation of Bayes-decision-rule can be performed smoothly. We also optimize the proposed objective function using an alternating optimization strategy. 
We experimentally demonstrate the effectiveness of the proposed BLA-KNN method by comparing it with 7 comparative methods on 15 benchmark datasets.</p>","PeriodicalId":14089,"journal":{"name":"International Journal of Intelligent Systems","volume":null,"pages":null},"PeriodicalIF":5.0000,"publicationDate":"2024-01-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Bayes-Decisive Linear KNN with Adaptive Nearest Neighbors\",\"authors\":\"Jin Zhang, Zekang Bian, Shitong Wang\",\"doi\":\"10.1155/2024/6664942\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<p>While the classical KNN (<i>k</i> nearest neighbor) shares its avoidance of the consistent distribution assumption between training and testing samples to achieve fast prediction, it still faces two challenges: (a) its generalization ability heavily depends on an appropriate number <i>k</i> of nearest neighbors; (b) its prediction behavior lacks interpretability. In order to address the two challenges, a novel Bayes-decisive linear KNN with adaptive nearest neighbors (<i>i.e</i>., BLA-KNN) is proposed to obtain the following three merits: (a) a diagonal matrix is introduced to adaptively select the nearest neighbors and simultaneously improve the generalization capability of the proposed BLA-KNN method; (b) the proposed BLA-KNN method owns the group effect, which inherits and extends the group property of the sum of squares for total deviations by reflecting the training sample class-aware information in the group effect regularization term; (c) the prediction behavior of the proposed BLA-KNN method can be interpreted from the Bayes-decision-rule perspective. In order to do so, we first use a diagonal matrix to weigh each training sample so as to obtain the importance of the sample, while constraining the importance weights to ensure that the adaptive <i>k</i> value is carried out efficiently. 
Second, we introduce a class-aware information regularization term in the objective function to obtain the nearest neighbor group effect of the samples. Finally, we introduce linear expression weights related to the distance measure between the testing and training samples in the regularization term to ensure that the interpretation of Bayes-decision-rule can be performed smoothly. We also optimize the proposed objective function using an alternating optimization strategy. We experimentally demonstrate the effectiveness of the proposed BLA-KNN method by comparing it with 7 comparative methods on 15 benchmark datasets.</p>\",\"PeriodicalId\":14089,\"journal\":{\"name\":\"International Journal of Intelligent Systems\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":5.0000,\"publicationDate\":\"2024-01-24\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"International Journal of Intelligent Systems\",\"FirstCategoryId\":\"94\",\"ListUrlMain\":\"https://onlinelibrary.wiley.com/doi/10.1155/2024/6664942\",\"RegionNum\":2,\"RegionCategory\":\"计算机科学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"International Journal of Intelligent Systems","FirstCategoryId":"94","ListUrlMain":"https://onlinelibrary.wiley.com/doi/10.1155/2024/6664942","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Bayes-Decisive Linear KNN with Adaptive Nearest Neighbors
While the classical KNN (k nearest neighbor) avoids any assumption that training and testing samples share a consistent distribution and thus achieves fast prediction, it still faces two challenges: (a) its generalization ability depends heavily on choosing an appropriate number k of nearest neighbors; (b) its prediction behavior lacks interpretability. To address these two challenges, a novel Bayes-decisive linear KNN with adaptive nearest neighbors (i.e., BLA-KNN) is proposed, which offers three merits: (a) a diagonal matrix is introduced to adaptively select the nearest neighbors and simultaneously improve the generalization capability of the proposed BLA-KNN method; (b) BLA-KNN exhibits a group effect, which inherits and extends the group property of the total sum of squared deviations by reflecting class-aware information about the training samples in the group-effect regularization term; (c) the prediction behavior of BLA-KNN can be interpreted from the perspective of the Bayes decision rule. To achieve this, we first use a diagonal matrix to weight each training sample, thereby obtaining the importance of each sample, while constraining the importance weights so that the adaptive selection of k is carried out efficiently. Second, we introduce a class-aware information regularization term into the objective function to obtain the nearest-neighbor group effect of the samples. Finally, we introduce linear expression weights, related to the distance measure between the testing and training samples, into the regularization term so that the Bayes-decision-rule interpretation can proceed smoothly. The proposed objective function is optimized with an alternating optimization strategy. We experimentally demonstrate the effectiveness of BLA-KNN by comparing it with 7 competing methods on 15 benchmark datasets.
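The abstract above gives only the high-level idea, not the authors' actual objective function. As a loose, hypothetical illustration of the core intuition (replacing a fixed k with learned, constrained per-sample importance weights, so that the number of samples that effectively vote adapts to the test point), the following toy sketch learns simplex-constrained weights by projected gradient descent on a reconstruction objective with a distance penalty. The function names, the penalty weight `lam`, and the reconstruction objective are all assumptions of this sketch, not the paper's formulation.

```python
import numpy as np

def project_to_simplex(v):
    """Euclidean projection of v onto the simplex {w : w >= 0, sum(w) = 1}.
    The projection tends to zero out weights of unhelpful samples, so the
    number of nonzero-weight "neighbors" adapts instead of being a fixed k."""
    u = np.sort(v)[::-1]                       # sort descending
    css = np.cumsum(u)
    rho = np.nonzero(u * np.arange(1, len(v) + 1) > (css - 1))[0][-1]
    theta = (css[rho] - 1) / (rho + 1.0)
    return np.maximum(v - theta, 0.0)

def adaptive_knn_predict(X_train, y_train, x_test, lam=0.1, n_iter=200, lr=0.01):
    """Toy sketch: learn simplex-constrained sample weights w that linearly
    reconstruct x_test from the training samples, with a distance-based
    penalty discouraging far-away samples, then take a weighted class vote."""
    d = np.linalg.norm(X_train - x_test, axis=1)   # distances to test point
    n = X_train.shape[0]
    w = np.full(n, 1.0 / n)                        # start from uniform weights
    for _ in range(n_iter):
        residual = X_train.T @ w - x_test          # reconstruction error
        grad = X_train @ residual + lam * d        # gradient + distance penalty
        w = project_to_simplex(w - lr * grad)      # gradient step, stay on simplex
    classes = np.unique(y_train)
    votes = np.array([w[y_train == c].sum() for c in classes])
    return classes[np.argmax(votes)]
```

In this sketch the learned weights play the role of the paper's diagonal importance matrix: samples that help reconstruct the test point (and are nearby) receive large weights, while the rest are driven to zero, which is one simple way an "adaptive k" can emerge from a constrained optimization.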
Journal Introduction:
The International Journal of Intelligent Systems serves as a forum for individuals interested in tapping into the vast body of theory underlying the construction of intelligent systems. In its peer-reviewed format, the journal explores editorials written by today's experts in the field. Because new developments are introduced every day, there is much to be learned: examination, analysis creation, information retrieval, man-computer interaction, and more. The International Journal of Intelligent Systems uses charts and illustrations to demonstrate these ground-breaking issues, and encourages readers to share their thoughts and experiences.