Introduction to VC learning theory

V. Cherkassky
{"title":"Introduction to VC learning theory","authors":"V. Cherkassky","doi":"10.1109/CIFER.2000.844584","DOIUrl":null,"url":null,"abstract":"In recent years, there has been an explosive growth of methods for estimating (learning) dependencies from data. The learning methods have been developed in the fields of statistics, neural networks, signal processing, fizzy systems etc. These methods have a common goal of estimating unknown dependencies from available (historical) data (samples). Estimated dependencies are then used for accurate prediction of future data (generalization). Hence this problem is known as Predictive Learning. Statistical Learning Theory (aka VC-theory or VapnikChervonenkis theory) has recently emerged as a general conceptual and mathematical framework for estimating (learning) dependencies from finite samples. Unfortunately, perhaps because of its mathematical rigor and complexity, this theory is not well known in the financial engineering community. Hence, the purpose of this tutorial is to discuss:","PeriodicalId":308591,"journal":{"name":"Proceedings of the IEEE/IAFE/INFORMS 2000 Conference on Computational Intelligence for Financial Engineering (CIFEr) (Cat. No.00TH8520)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"1900-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings of the IEEE/IAFE/INFORMS 2000 Conference on Computational Intelligence for Financial Engineering (CIFEr) (Cat. No.00TH8520)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/CIFER.2000.844584","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In recent years, there has been an explosive growth of methods for estimating (learning) dependencies from data. These learning methods have been developed in the fields of statistics, neural networks, signal processing, fuzzy systems, etc. They share a common goal: estimating unknown dependencies from available (historical) data (samples). The estimated dependencies are then used for accurate prediction of future data (generalization). Hence this problem is known as Predictive Learning. Statistical Learning Theory (also known as VC-theory or Vapnik-Chervonenkis theory) has recently emerged as a general conceptual and mathematical framework for estimating (learning) dependencies from finite samples. Unfortunately, perhaps because of its mathematical rigor and complexity, this theory is not well known in the financial engineering community. Hence, the purpose of this tutorial is to discuss:
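The abstract's central idea, that a dependency estimated from a finite sample must be judged by its prediction error on future data rather than by its fit to the training data, can be sketched with a toy experiment. This is an illustrative sketch of my own, not code from the tutorial: polynomial models of increasing complexity are fit to a small noisy sample of an assumed target function, and training error is compared with error on fresh test data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed unknown dependency: y = sin(2*pi*x) + noise (illustrative choice).
def true_f(x):
    return np.sin(2 * np.pi * x)

# Small "historical" sample and a larger "future" test set.
x_train = rng.uniform(0, 1, 20)
y_train = true_f(x_train) + rng.normal(0, 0.2, 20)
x_test = rng.uniform(0, 1, 200)
y_test = true_f(x_test) + rng.normal(0, 0.2, 200)

def mse(coeffs, x, y):
    """Mean squared error of a fitted polynomial on data (x, y)."""
    return float(np.mean((np.polyval(coeffs, x) - y) ** 2))

train_err, test_err = [], []
for degree in range(1, 9):  # increasing model complexity
    coeffs = np.polyfit(x_train, y_train, degree)
    train_err.append(mse(coeffs, x_train, y_train))
    test_err.append(mse(coeffs, x_test, y_test))

# Training error keeps shrinking as complexity grows, but prediction
# (test) error does not: an underfit linear model predicts worse than
# a moderate-complexity one. Controlling this trade-off for finite
# samples is exactly what VC-theory formalizes.
```

Since the degree-`d` polynomials are nested model classes, training error is non-increasing in the degree, while test error typically improves and then degrades, which is the generalization trade-off the tutorial's framework addresses.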