Huan Zhang, Kexin Meng, Pei Lv, Shuo He, Mingliang Xu. Pattern Recognition, vol. 171, Article 112181, 2025. DOI: 10.1016/j.patcog.2025.112181
A general dual-view framework for instance weighted naive Bayes
Instance weighting is an effective and flexible method to alleviate the attribute conditional independence assumption in naive Bayes (NB). However, existing instance weighting methods mainly focus on how to learn a specific weight for each instance, ignoring the limitations of the original view. In this study, we argue that real-world applications are rather complicated, and it is sub-optimal to learn instance weights using only the original view. Based on this premise, we propose a novel general framework called dual-view instance weighted naive Bayes (DIWNB). In DIWNB, we first construct multiple K-nearest neighbor (KNN) classifiers and select those with the lowest error rate to classify each training instance in turn, building the generated view. Next, we learn a specific weight for each training instance and build an instance weighted NB model in each view. Finally, we fuse the class-membership probabilities of the two views in a weighted manner to predict the class label for each test instance. To construct the generated view, we design a hard label approach and a soft label approach, yielding two different versions, which we denote as DIWNB^H and DIWNB^S, respectively. Experimental results on 60 benchmark datasets and 2 real-world datasets demonstrate the effectiveness of DIWNB.
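The abstract's pipeline, an instance weighted NB model per view followed by weighted fusion of the two views' class-membership probabilities, can be sketched as follows. This is a minimal illustration, not the paper's implementation: the weight-learning step and view construction are omitted (instance weights `w` and the fusion weight `alpha` are taken as given, hypothetical inputs), and Laplace smoothing is an assumption.

```python
import numpy as np

def weighted_nb_fit(X, y, w, n_classes, n_values):
    """Fit a naive Bayes model on discrete features, where each training
    instance contributes its weight w[i] instead of a raw count of 1.
    Laplace smoothing is applied to priors and conditionals."""
    n_features = X.shape[1]
    prior = np.zeros(n_classes)
    cond = [np.zeros((n_classes, v)) for v in n_values]
    for c in range(n_classes):
        wc = w[y == c]          # weights of class-c instances
        Xc = X[y == c]
        prior[c] = (wc.sum() + 1.0) / (w.sum() + n_classes)
        for j in range(n_features):
            for v in range(n_values[j]):
                cond[j][c, v] = (wc[Xc[:, j] == v].sum() + 1.0) / (wc.sum() + n_values[j])
    return prior, cond

def weighted_nb_proba(x, prior, cond):
    """Class-membership probabilities for one test instance (log-space for stability)."""
    logp = np.log(prior).copy()
    for j, v in enumerate(x):
        logp += np.log(np.array([cond[j][c, v] for c in range(len(prior))]))
    p = np.exp(logp - logp.max())
    return p / p.sum()

def dual_view_fuse(p_orig, p_gen, alpha=0.5):
    """Weighted fusion of the two views' probabilities; alpha is a
    hypothetical mixing weight, not specified in the abstract."""
    p = alpha * p_orig + (1.0 - alpha) * p_gen
    return p / p.sum()

# Toy usage: 4 instances, 2 binary features, uniform instance weights.
X = np.array([[0, 0], [0, 1], [1, 1], [1, 0]])
y = np.array([0, 0, 1, 1])
w = np.ones(4)
prior, cond = weighted_nb_fit(X, y, w, n_classes=2, n_values=[2, 2])
p_orig = weighted_nb_proba([0, 0], prior, cond)
p_fused = dual_view_fuse(p_orig, p_orig)  # identical views just recover p_orig
```

In the full framework each view would get its own learned weight vector and fitted model; the sketch reuses one view twice purely to show the fusion step.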
Journal introduction:
The field of Pattern Recognition is both mature and rapidly evolving, playing a crucial role in various related fields such as computer vision, image processing, text analysis, and neural networks. It closely intersects with machine learning and is being applied in emerging areas like biometrics, bioinformatics, multimedia data analysis, and data science. The journal Pattern Recognition, established half a century ago during the early days of computer science, has since grown significantly in scope and influence.