Interpretable Synthetic Reduced Nearest Neighbor: An Expectation Maximization Approach
Pooya Tavallali, P. Tavallali, M. Khosravi, M. Singhal
2020 IEEE International Conference on Image Processing (ICIP), October 2020. DOI: 10.1109/ICIP40778.2020.9190986
Synthetic Reduced Nearest Neighbor (SRNN) is a Nearest Neighbor model that is constrained to have K synthetic samples (prototypes/centroids). There have been few attempts at directly optimizing SRNN with proper guarantees, such as convergence, or at making it interpretable. To tackle these issues, this paper, inspired by the K-means algorithm, provides a novel Expectation Maximization based optimization of Synthetic Reduced Nearest Neighbor (EM-SRNN) that always converges while monotonically decreasing the objective function. The optimization alternates between updating the centroids of the model and assigning training samples to centroids. EM-SRNN is interpretable since the centroids represent sub-clusters of the classes. This type of interpretability is suitable for various applications, such as image processing and epidemiological studies. Analytical aspects of the problem are explored, and the optimization is shown to have linear complexity in the size of the training set. Finally, EM-SRNN is shown to have superior or similar performance compared with several other interpretable state-of-the-art models, such as trees and kernel SVMs.
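To make the alternating scheme concrete, the following is a minimal Python sketch of the kind of EM-style loop the abstract describes: samples are assigned to their nearest centroid, then each centroid is updated from its assigned sub-cluster, so the K prototypes remain readable as class representatives. The function names, the mean-based centroid update, and the majority-vote relabeling are illustrative assumptions; the paper's actual EM-SRNN update optimizes a classification objective with convergence guarantees and is not reproduced here.

import numpy as np

def em_srnn_sketch(X, y, K, n_iter=50, seed=0):
    """Illustrative alternating optimization for a prototype-based
    nearest-neighbor classifier (not the paper's exact EM-SRNN update)."""
    rng = np.random.default_rng(seed)
    n, _ = X.shape
    # Initialize the K centroids from random training samples; each centroid
    # inherits the label of the sample it starts from.
    idx = rng.choice(n, size=K, replace=False)
    centroids, labels = X[idx].copy(), y[idx].copy()
    for _ in range(n_iter):
        # Assignment step: each sample goes to its nearest centroid,
        # costing O(nK) distance evaluations per iteration.
        dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        assign = dists.argmin(axis=1)
        # Centroid step: move each centroid to the mean of its sub-cluster and
        # relabel it with the sub-cluster's majority class, keeping it
        # interpretable as a class prototype.
        for k in range(K):
            members = assign == k
            if members.any():
                centroids[k] = X[members].mean(axis=0)
                vals, counts = np.unique(y[members], return_counts=True)
                labels[k] = vals[counts.argmax()]
    return centroids, labels

def predict(X, centroids, labels):
    # Classify by the label of the nearest centroid (1-NN over the K prototypes).
    dists = ((X[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
    return labels[dists.argmin(axis=1)]

# Toy usage on synthetic two-class data (purely illustrative).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0.0, 1.0, (100, 2)), rng.normal(4.0, 1.0, (100, 2))])
y = np.array([0] * 100 + [1] * 100)
centroids, labels = em_srnn_sketch(X, y, K=4)
accuracy = (predict(X, centroids, labels) == y).mean()

The per-iteration cost of this loop is linear in the number of training samples for fixed K, which mirrors the linear complexity over the training set claimed in the abstract.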