{"title":"Multi-resolution support vector machine","authors":"Xuhui Shao, V. Cherkassky","doi":"10.1109/IJCNN.1999.831103","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.831103","url":null,"abstract":"The support vector machine (SVM) is a new learning methodology based on Vapnik-Chervonenkis (VC) theory (Vapnik, 1982, 1995). SVM has recently attracted growing research interest due to its ability to learn classification and regression tasks with high-dimensional data. The SVM formulation uses kernel representation. The existing algorithm leaves the choice of the kernel type and kernel parameters to the user. This paper describes an important extension to the SVM method: the multiresolution SVM (M-SVM) in which several kernels of different scales can be used simultaneously to approximate the target function. The proposed M-SVM approach enables 'automatic' selection of the 'optimal' kernel width. This usually results in better prediction accuracy of SVM models.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"134 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127488768","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Improved CBP learning with output bias decomposition","authors":"M. Lehtokangas","doi":"10.1109/IJCNN.1999.832632","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.832632","url":null,"abstract":"Choosing a network size is a difficult problem in neural network modelling. In many recent studies constructive or destructive methods that add or delete connections, neurons, layers have been studied for solving this problem In this work we consider the constructive approach. In particular we address the construction of feedforward networks by the use of improved constructive backpropagation that utilizes output bias decomposition scheme. The proposed improved scheme is shown to be beneficial especially in regression type problems like time series modelling. Namely, our time series prediction experiments demonstrate that the improved method is competitive in terms of modelling performance and training time compared to the well known cascade-correlation method.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127506327","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The learning behavior of single neuron classifiers on linearly separable or nonseparable input","authors":"M. Basu, T. Ho","doi":"10.1109/IJCNN.1999.831142","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.831142","url":null,"abstract":"Determining linear separability is an important way of understanding structures present in data. We explore the behavior of several classical descent procedures for determining linear separability and training linear classifiers in the presence of linearly nonseparable input. We compare the adaptive procedures to linear programming methods using many pairwise discrimination problems from a public database. We found that the adaptive procedures have serious implementation problems which make them less preferable than linear programming.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124817205","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Artificial life system for optimization of nonconvex functions","authors":"T. Satoh, A. Uchibori, Kanya Tanaka","doi":"10.1109/IJCNN.1999.833441","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.833441","url":null,"abstract":"This paper presents a distributed algorithm for optimization of nonconvex multimodal functions. In recent years, new distributed algorithms based on artificial life (ALife) system has been studied and its potential power has been demonstrated. In this paper, the frame work of ALife system is employed into a function minimization. We also propose a hybrid algorithm in which ALife system is incorporated with the local search method for finding good start points for the local search. Since the proposed method utilizes no gradient information it can be applied to very wide class of optimization problems. The effectiveness of the proposed method is demonstrated through some numerical tests.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124857384","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A tuning algorithm for the PID controller utilizing fuzzy theory","authors":"Hyung-Soo Hwang, Jeoung-Nae Choi, Won-Hyok Lee, Jin-Kwon Kim","doi":"10.1109/IJCNN.1999.833404","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.833404","url":null,"abstract":"In this paper, we proposed a new PID tuning algorithm by the fuzzy set theory to improve the performance of the PID controller. The new tuning algorithm for the PID controller has the initial value of parameter Kp, /spl tau//sub I/, /spl tau//sub D/, by the Ziegler-Nichols formula (1942) that uses the ultimate gain and ultimate period from a relay tuning experiment. We will get the error and the error rate of plant output corresponding to the initial value of parameter and find the new proportion gain (Kp) and the integral time (/spl tau//sub I/) from fuzzy tuner by the error and error rate of plant output as a membership function of fuzzy theory. This fuzzy auto tuning algorithm for PID controller considerably reduced the overshoot and rise time as compared to any other PID controller tuning algorithms. And in real parametric uncertainty systems, it constitutes an appreciable improvement of performance. The significant properties of this algorithm is shown by simulation.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"3 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125061896","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Image segmentation based on motion/luminance integration and oscillatory correlation","authors":"E. Çesmeli, Deliang Wang","doi":"10.1109/IJCNN.1999.833510","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.833510","url":null,"abstract":"An image segmentation method is proposed based on the integration of motion and luminance information. The method is composed of two parallel pathways that process motion and luminance, respectively. Inspired by the visual system, the motion pathway has two stages. The first stage estimates local motion at locations with reliable information The second stage groups locations based on their motion estimates. In the parallel pathway, the input scene is segmented based on luminance. In the subsequent integration stage, motion estimates are refined to obtain the final segmentation result in the motion pathway. For segmentation, LEGION (Locally Excitatory Globally Inhibitory Oscillator Networks) is employed whereby the phases of oscillators are used for region labeling. Results on synthetic and real image sequences are provided.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126127509","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Predictive head tracking for virtual reality","authors":"Emad W. Saad, T. Caudell, D. Wunsch","doi":"10.1109/IJCNN.1999.830785","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.830785","url":null,"abstract":"In virtual reality (VR), head movement is tracked through inertial and optical sensors. Computation and communication times result in delays between measurements and updating of the new frame in the head mounted display(HMD). These delays result in problems, including motion sickness. We use recurrent and time delay neural networks to predict the head location and use it to calculate the new frame. A predictability analysis is used in designing the prediction system.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"49 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126151239","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Recognition of unconstrained handwritten digits using modified chaotic neural networks","authors":"Han-Go Choi, Jae-Heung Cho, Sang-Hee Kim, Sang-Jae Lee","doi":"10.1109/IJCNN.1999.833549","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.833549","url":null,"abstract":"This paper describes an off-line method for recognizing totally unconstrained handwritten digits using modified chaotic neural networks (CNN). Since the CNN has inherently the characteristics of highly nonlinear dynamics it can be an appropriate network for the robust classification of complex patterns. The CNN in this paper is trained by the error backpropagation algorithm. Digit identification starts with extraction of features from the raw digit images and then recognizes digits using the CNN based classifier The performance of the CNN classifier is evaluated on the Concordia database. For the relative comparison of recognition performance the CNN classifier is compared with the recurrent neural networks (RNN) classifier Experimental results show that the classification rate is 98.4%. It indicates that the CNN classifier outperforms the RNN classifier as well as other classifiers that have been reported on the same database.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123581225","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A numerical exploration of a stochastic model of human list learning","authors":"K. Wilson, C. Osborne","doi":"10.1109/IJCNN.1999.831451","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.831451","url":null,"abstract":"A simple network which shows primacy and recency effects is presented. The model uses stochastic updating of clipped weights to produce a range of different memory behaviours. The model, originally proposed by Kahn, Wong and Shewington (1991, 1995), shows a much wider range of behaviours than originally predicted. These behaviours depend on the probability of updating weights, initial non-zero weights, type and degree of dilution.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125525358","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A local face statistics recognition methodology beyond ICA and/or PCA","authors":"Annie Xin Guan, H. Szu","doi":"10.1109/IJCNN.1999.831094","DOIUrl":"https://doi.org/10.1109/IJCNN.1999.831094","url":null,"abstract":"We have reviewed the independent component analysis (ICA), as an unsupervised ANN learning algorithm for redundancy reduction and feature extraction, and compared its performance with the classical principal component analysis (PCA) of face images, known as \"eigenfaces\". Based on our experiments, we believe that with PCA and ICA representations, a promising 85% to 95% PD with approximately 5% to 10% FAR in the ROC experiments might be achieved for a closed library set of persons, each of which has different profiles and lightening expressions. ICA encodes face images with statistically independent variables, which are not necessarily associated with the orthogonal axes, while PCA is always associated with orthogonal eigenvectors. Sometimes, the projections onto the ICA non-orthogonal axes are above the recognition threshold while the projections upon the orthogonal PCA axes are under the threshold However, both these pixel-based statistical processing algorithms have their drawbacks. The major one is that they weight the whole face equally and therefore lack the local geometry information. We argue that a fully robust face recognition or pattern recognition system should take both the gestalt geometry principle and the individual statistical features into account, i.e. it should approach from both statistical and geometry perspectives. An efficient way to implement both is the local or regional statistics, which may be called the local ICA or local PCA.","PeriodicalId":157719,"journal":{"name":"IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339)","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1999-07-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114953994","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}