Novel Kernels and Kernel PCA for Pattern Recognition

J. Isaacs, S. Foo, A. Meyer-Bäse
DOI: 10.1109/CIRA.2007.382927
Published: 2007-06-20, 2007 International Symposium on Computational Intelligence in Robotics and Automation
Citations: 11

Abstract

Kernel methods are a mathematical tool that provides a generally higher-dimensional representation of a given data set in feature space for feature recognition and image analysis problems. Typically, the kernel trick is thought of as a method for converting a linear classification learning algorithm into a non-linear one, by mapping the original observations into a higher-dimensional non-linear space so that linear classification in the new space is equivalent to non-linear classification in the original space. Moreover, optimal kernels can be designed to capture the natural variation present in the data. In this paper we present fifteen novel kernel functions and their respective performance for kernel principal component analysis on five select databases. Empirical results show that our kernels perform as well as or better than existing kernels on these databases.
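The kernel-PCA procedure the abstract refers to can be sketched in a few steps: build the kernel (Gram) matrix, center it in feature space, eigendecompose it, and project the data onto the leading eigenvectors. The sketch below uses a standard Gaussian (RBF) kernel for illustration only; the paper's fifteen novel kernels are not reproduced here, and the function names and the `gamma` parameter are this example's own choices, not the authors'.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared Euclidean distances, then the Gaussian (RBF) kernel.
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * (X @ X.T)
    return np.exp(-gamma * d2)

def kernel_pca(X, n_components=2, gamma=1.0):
    K = rbf_kernel(X, gamma)
    n = K.shape[0]
    # Center the kernel matrix in feature space:
    # K_c = K - 1_n K - K 1_n + 1_n K 1_n, where 1_n is the n x n matrix of 1/n.
    one = np.full((n, n), 1.0 / n)
    Kc = K - one @ K - K @ one + one @ K @ one
    # eigh returns eigenvalues in ascending order; take the largest ones.
    vals, vecs = np.linalg.eigh(Kc)
    idx = np.argsort(vals)[::-1][:n_components]
    vals, vecs = vals[idx], vecs[:, idx]
    # Scale eigenvectors so the feature-space principal axes have unit norm.
    alphas = vecs / np.sqrt(np.maximum(vals, 1e-12))
    # Projections of the training points onto the principal components.
    return Kc @ alphas

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 3))
Z = kernel_pca(X, n_components=2, gamma=0.5)
```

Because the kernel matrix is centered before the eigendecomposition, the projected components are mean-zero; swapping in a different positive semi-definite kernel function is a one-line change, which is how the paper's novel kernels would plug into the same pipeline.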