Subspace-Classification Approach for Simulated Tuple Class Assignment

Victor Robila, R. Haralick
DOI: 10.1109/LISAT58403.2023.10179509
Published in: 2023 IEEE Long Island Systems, Applications and Technology Conference (LISAT), May 5, 2023
Citations: 0

Abstract

We developed a subspace classifier for measurement classification as an alternative to current deep learning approaches. Many modern neural networks cannot provide an understandable explanation for their classifications. The subspace classifier decomposes the classification problem, making it computationally simpler. We first use a Bayesian method in which all the class-conditional probabilities for the entire measurement space are stored in memory. We then ran experiments with simulated class-conditional distributions and defined a subspace classifier that stores only the class-conditional probabilities for the subspaces. Because it uses far less memory, this classifier can handle much larger distributions than the previous model, so we extended it to cases where the measurement space is generated sequentially and does not all have to reside in memory at once. For distributions that fit in memory, we also compared the Bayesian approach with the subspace approach: the Bayesian subspace classifiers consistently outperformed the subspace classifiers without Bayes' rule by a large margin. We also compared the subspace classifier with three Python machine-learning models, namely a Ridge classifier, a multi-layer perceptron (MLP) classifier (neural network), and a support vector machine (SVM), on a set of tuples and class-conditional probability distributions with 4 classes. Over 10,000 trials, the subspace classifier had an average probability of correct identification of 0.25172, the SVM model an average accuracy of 0.20987, the MLP classifier 0.2140, and the Ridge classifier 0.2798.
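The two-stage idea in the abstract (a full-memory Bayes lookup over the whole measurement space, then compact per-subspace tables) can be sketched with a toy discrete example. This is not the authors' code: the tuple length, alphabet size, uniform prior, and the use of single-slot marginals as the "subspaces" are all illustrative assumptions.

```python
# Toy sketch of the abstract's two stages, under assumed parameters:
# discrete measurement tuples, 4 classes, random class-conditional tables.
import itertools
import random

random.seed(0)

N_CLASSES = 4          # the paper's experiments use 4 classes
TUPLE_LEN = 3          # each measurement is a small discrete tuple (assumed)
ALPHABET = tuple(range(4))   # values per tuple slot (assumed)

# The entire measurement space: all 4**3 = 64 possible tuples.
space = list(itertools.product(ALPHABET, repeat=TUPLE_LEN))

def random_distribution(space):
    """A random class-conditional probability table over the whole space."""
    w = [random.random() for _ in space]
    s = sum(w)
    return {x: wi / s for x, wi in zip(space, w)}

# Stage 1: one full table P(x | c) per class; with a small space they
# all fit in memory, so exact Bayes classification is a lookup.
cond = [random_distribution(space) for _ in range(N_CLASSES)]
prior = [1.0 / N_CLASSES] * N_CLASSES   # uniform class prior (assumed)

def bayes_classify(x):
    """Assign the class maximizing P(x | c) * P(c)."""
    return max(range(N_CLASSES), key=lambda c: cond[c][x] * prior[c])

def marginal(dist, dim):
    """Class-conditional probabilities restricted to one subspace (one slot)."""
    m = {v: 0.0 for v in ALPHABET}
    for x, p in dist.items():
        m[x[dim]] += p
    return m

# Stage 2: store only per-subspace tables -- TUPLE_LEN small 1-D marginals
# per class instead of the full joint, far less memory for large spaces.
sub = [[marginal(cond[c], d) for d in range(TUPLE_LEN)] for c in range(N_CLASSES)]

def subspace_classify(x):
    """Score each class by the product of its subspace marginals."""
    def score(c):
        s = prior[c]
        for d in range(TUPLE_LEN):
            s *= sub[c][d][x[d]]
        return s
    return max(range(N_CLASSES), key=score)

def sample_from(dist):
    """Draw one tuple from a discrete distribution by inverse CDF."""
    r, acc = random.random(), 0.0
    for x, p in dist.items():
        acc += p
        if r <= acc:
            return x
    return x  # guard against floating-point round-off

# Monte-Carlo estimate of the probability of correct identification:
# draw a class, draw a tuple from its conditional, classify, score.
trials = 10_000
draws = [(c, sample_from(cond[c]))
         for c in (random.randrange(N_CLASSES) for _ in range(trials))]
acc_bayes = sum(bayes_classify(x) == c for c, x in draws) / trials
acc_sub = sum(subspace_classify(x) == c for c, x in draws) / trials
print(f"full Bayes lookup: {acc_bayes:.4f}   subspace marginals: {acc_sub:.4f}")
```

With random joint tables, the one-slot marginals are nearly uniform, so the subspace scores stay close to chance (0.25 for 4 classes) while the full-table Bayes lookup does markedly better, which mirrors the abstract's finding that adding Bayes' rule to the subspace decomposition helps by a large margin.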