{"title":"教程1:神经网络和支持向量机","authors":"C. Chandra Sekhar, S. Thamarai Selvi, C. N. Rao","doi":"10.1109/ICOAC.2011.6165238","DOIUrl":null,"url":null,"abstract":"Discriminative learning based approaches to pattern classification are important for tasks that involve nonlinearly separable classes and overlapping classes as in many real world pattern classification tasks. Multilayer feedforward neural networks built using the computational models of neurons with sigmoidal activation function, and trained using the error back propagation learning algorithm have been explored for complex pattern classification tasks. The main limitations of theses models are the slow convergence of the learning method, the local minima problem and the poor generalization ability of trained models. Support vector machines overcome these limitations by using the principles of kernel methods. In the kernel methods for pattern classification, the first stage involves nonlinear transformation of representation of an example in a low dimensional input feature space to a representation in a high dimensional feature space induced by a kernel function, so that the nonlinearly separable classes in the input feature space are likely be linearly separable classes in the kernel feature space. The second stage in the kernel methods involves constructing an optimal linear solution in the kernel feature space that corresponds to an optimal nonlinear solution in the input feature space. The main advantages of the support vector machines are the good generalization ability and the requirement of small size training data sets. The main issue in the design of support vector machines is the choice of kernel function that induces the nonlinear transformation. Support vector machines can be used for vectorial representations of data as well as for non-vectorial representations of data, whereas multilayer feed forward neural networks can be used mainly for vectorial representations of data. The tutorial presents the underlying principles of approaches to pattern classification using multilayer feedforward neural networks and support vector machines, and discusses the issues in developing pattern classification models using these approaches. Some applications of support vector machines to pattern classification tasks in speech and image processing are also presented.","PeriodicalId":369712,"journal":{"name":"2011 Third International Conference on Advanced Computing","volume":"4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"Tutorial I: Neural networks and support vector machines\",\"authors\":\"C. Chandra Sekhar, S. Thamarai Selvi, C. N. Rao\",\"doi\":\"10.1109/ICOAC.2011.6165238\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Discriminative learning based approaches to pattern classification are important for tasks that involve nonlinearly separable classes and overlapping classes as in many real world pattern classification tasks. Multilayer feedforward neural networks built using the computational models of neurons with sigmoidal activation function, and trained using the error back propagation learning algorithm have been explored for complex pattern classification tasks. The main limitations of theses models are the slow convergence of the learning method, the local minima problem and the poor generalization ability of trained models. 
Support vector machines overcome these limitations by using the principles of kernel methods. In the kernel methods for pattern classification, the first stage involves nonlinear transformation of representation of an example in a low dimensional input feature space to a representation in a high dimensional feature space induced by a kernel function, so that the nonlinearly separable classes in the input feature space are likely be linearly separable classes in the kernel feature space. The second stage in the kernel methods involves constructing an optimal linear solution in the kernel feature space that corresponds to an optimal nonlinear solution in the input feature space. The main advantages of the support vector machines are the good generalization ability and the requirement of small size training data sets. The main issue in the design of support vector machines is the choice of kernel function that induces the nonlinear transformation. Support vector machines can be used for vectorial representations of data as well as for non-vectorial representations of data, whereas multilayer feed forward neural networks can be used mainly for vectorial representations of data. The tutorial presents the underlying principles of approaches to pattern classification using multilayer feedforward neural networks and support vector machines, and discusses the issues in developing pattern classification models using these approaches. Some applications of support vector machines to pattern classification tasks in speech and image processing are also presented.\",\"PeriodicalId\":369712,\"journal\":{\"name\":\"2011 Third International Conference on Advanced Computing\",\"volume\":\"4 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2011-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2011 Third International Conference on Advanced Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/ICOAC.2011.6165238\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 Third International Conference on Advanced Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICOAC.2011.6165238","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Tutorial I: Neural networks and support vector machines
Discriminative learning based approaches to pattern classification are important for tasks that involve nonlinearly separable and overlapping classes, as is the case in many real-world pattern classification problems. Multilayer feedforward neural networks, built from computational models of neurons with a sigmoidal activation function and trained using the error backpropagation learning algorithm, have been explored for complex pattern classification tasks. The main limitations of these models are the slow convergence of the learning method, the local minima problem, and the poor generalization ability of the trained models. Support vector machines overcome these limitations by using the principles of kernel methods. In kernel methods for pattern classification, the first stage involves a nonlinear transformation of the representation of an example from a low-dimensional input feature space to a representation in a high-dimensional feature space induced by a kernel function, so that classes that are nonlinearly separable in the input feature space are likely to be linearly separable in the kernel feature space. The second stage involves constructing an optimal linear solution in the kernel feature space that corresponds to an optimal nonlinear solution in the input feature space. The main advantages of support vector machines are their good generalization ability and their ability to be trained on small training data sets. The main issue in the design of support vector machines is the choice of the kernel function that induces the nonlinear transformation. Support vector machines can be used for both vectorial and non-vectorial representations of data, whereas multilayer feedforward neural networks are mainly applicable to vectorial representations of data. The tutorial presents the underlying principles of pattern classification using multilayer feedforward neural networks and support vector machines, and discusses the issues involved in developing pattern classification models with these approaches. Some applications of support vector machines to pattern classification tasks in speech and image processing are also presented.
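The two approaches contrasted in the abstract can be illustrated with a short, self-contained sketch. The example below is not part of the tutorial itself: it assumes scikit-learn as the library and uses a synthetic two-moons dataset with illustrative hyperparameters, contrasting a sigmoidal multilayer feedforward network trained by error backpropagation with an RBF-kernel support vector machine on a nonlinearly separable problem.

```python
# Minimal sketch (illustrative only): sigmoidal MLP trained by backpropagation
# versus an RBF-kernel SVM on a synthetic nonlinearly separable problem.
# Dataset, hyperparameters, and library choice are assumptions, not from the tutorial.

from sklearn.datasets import make_moons
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.svm import SVC

# Two interleaved half-moons: a standard example of classes that are not
# linearly separable in the input feature space.
X, y = make_moons(n_samples=500, noise=0.2, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Multilayer feedforward network with sigmoidal (logistic) hidden units,
# trained by gradient-descent error backpropagation.
mlp = MLPClassifier(hidden_layer_sizes=(16,), activation="logistic",
                    solver="sgd", learning_rate_init=0.1, max_iter=2000,
                    random_state=0)
mlp.fit(X_train, y_train)

# Support vector machine with an RBF kernel: the kernel implicitly maps the
# input space to a high-dimensional feature space in which the classes are
# (approximately) linearly separable, and a maximum-margin hyperplane is found.
svm = SVC(kernel="rbf", C=1.0, gamma="scale")
svm.fit(X_train, y_train)

print("MLP test accuracy:", mlp.score(X_test, y_test))
print("SVM test accuracy:", svm.score(X_test, y_test))
```

On this kind of toy problem both models can fit the nonlinear decision boundary; the practical differences the tutorial emphasizes (convergence behaviour, local minima, generalization, and the choice of kernel) show up in how sensitive each model is to its training settings rather than in the final code.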