{"title":"Nonlinear complex principal component analysis and its applications","authors":"Sanjay S. P. Rattan, W. W. Hsieh, Columbia Vancouver, B. Ruessink","doi":"10.1109/IJCNN.2005.1556122","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556122","url":null,"abstract":"Complex principal component analysis (CPCA) is a linear multivariate technique commonly applied to complex variables or 2D vector fields such as winds or currents. A new nonlinear CPCA (NLCPCA) method has been developed via complex-valued multi-layer perceptron neural networks. NLCPCA is applied to the tropical Pacific wind field to study the interannual variability. Compared to the CPCA mode 1, the NLCPCA mode 1 is found to explain more variance and reveal the asymmetry in the wind anomalies between warm (El Nino) and cool (La Nina) states. NLCPCA can also be used to nonlinearly generalize Hilbert PCA (where real data is complexified prior to performing CPCA). An example is provided from the nearshore bathymetry at Egmond, Netherlands, where sand bars propagate offshore, and unlike the CPCA mode 1, the NLCPCA mode 1 detects asymmetry between the bars and the troughs.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122098690","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural network-based analysis of DNA microarray data","authors":"J. Patra, Lei Wang, Ee-Luang Ang, N.S. Chaudhari","doi":"10.1109/IJCNN.2005.1555882","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1555882","url":null,"abstract":"The analysis of DNA microarray expression data has become an important subject in bioinformatics. Scientists have adopted different approaches to select the informative genes those can distinguish different types of cancers. In this paper, we show the use of a dimension reduction technique such as singular value decomposition (SVD) to capture the genes with similar patterns. We propose a novel method of selection of feature genes based on information loss using SVD. To assign the samples to known classes, we design a multi-layer perceptron-based classifier with reduced dimensional input vectors. We provide performance comparison between different selection methods in terms of classification rate","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122139343","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"GETnet: a general framework for evolutionary temporal neural networks","authors":"R. Derakhshani","doi":"10.1109/IJCNN.2005.1556431","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556431","url":null,"abstract":"Among the more challenging problems in the design of temporal neural networks are the incorporation of short and long-term memories and the choice of network topology. Delayed copies of network signals can form short-term memory (STM), whereas feedback loops can constitute long-term memories (LTM). This paper introduces a new general evolutionary temporal neural network framework (GETnet) for the automated design of neural networks with distributed STM and LTM. GETnet is a step towards the realization of general intelligent systems that can be applied to a broad range of problems. GETnet utilizes nonlinear moving average and autoregressive nodes and sub-circuits that are trained by enhanced gradient descent and evolutionary search in architecture, synaptic delay, and synaptic weight spaces. The ability to evolve arbitrary time-delay connections enables GETnet to find novel answers to classification and system identification tasks. A new temporal minimum description length policy ensures creation of fast and compact networks with improved generalization capabilities. Simulations using Mackey-Glass time series are presented to demonstrate the above stated capabilities of GETnet.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"171 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125780196","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Modular general fuzzy hyperline segment neural network","authors":"P. Patil, M. Deshmukh","doi":"10.1109/IJCNN.2005.1556172","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556172","url":null,"abstract":"This paper describes modular general fuzzy hyperline segment neural network (MGFHLSNN) with its learning algorithm, which is an extension of general fuzzy hyperline segment neural network (GFHLSNN) proposed by Patil, Kulkarni and Sontakke (2002) that combines supervised and unsupervised learning in a single algorithm so that it can be used for pure classification, pure clustering and hybrid classification/clustering. MGFHLSNN offers higher degree of parallelism since each module is exposed to the patterns of only one class and trained without overlap test and removal, unlike in fuzzy hyperline segment neural network (FHLSNN) by U.V. Kulkami et al. (2001) leading to reduction in training time. In proposed algorithm each module captures peculiarity of only one particular class and found superior in terms of generalization and training time with equivalent testing time. Thus, it can be used for voluminous realistic database, where new patterns can be added on fly.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129789830","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Maximal variation and missing values for componentwise support vector machines","authors":"K. Pelckmans, J. Suykens, B. Moor, J. Brabanter","doi":"10.1109/IJCNN.2005.1556371","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556371","url":null,"abstract":"This paper proposes primal-dual kernel machine classifiers based on worst-case analysis of a finite set of observations including missing values of the inputs. Key ingredients are the use of a componentwise support vector machine (cSVM) and an empirical measure of maximal variation of the components to bind the influence of the component which cannot be evaluated due to missing values. A regularization term based on the L/sub 1/ norm of the maximal variation is used to obtain a mechanism for structure detection in that context. An efficient implementation using the hierarchical kernel machines framework is elaborated.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128709757","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Feature selection by independent component analysis and mutual information maximization in EEG signal classification","authors":"Tian Lan, Deniz Erdoğmuş, A. Adami, M. Pavel","doi":"10.1109/IJCNN.2005.1556405","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556405","url":null,"abstract":"Feature selection and dimensionality reduction are important steps in pattern recognition. In this paper, we propose a scheme for feature selection using linear independent component analysis and mutual information maximization method. The method is theoretically motivated by the fact that the classification error rate is related to the mutual information between the feature vectors and the class labels. The feasibility of the principle is illustrated on a synthetic dataset and its performance is demonstrated using EEG signal classification. Experimental results show that this method works well for feature selection.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129621127","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fisher information quantifies task-specific performance in the blowfly photoreceptor","authors":"Peng Xu, P. Abshire","doi":"10.1109/IJCNN.2005.1555842","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1555842","url":null,"abstract":"Performance on specific tasks in an organism's everyday activities is essential to survival. In this paper, we extend information-theoretic investigation of neural systems to task specific information using a detailed biophysical model of the blowfly photoreceptor. We formulate the response of the photoreceptor to incident flashes and determine the optimal detection performance using ideal observer analysis. Furthermore, we derive Fisher information contained in the output of the photoreceptor, and show how Fisher information is related to the detection performance. In addition we use Fisher information to show the connections between detection performance, signal-noise ratio, and discriminability. Our detailed biophysical model of the blowfly photoreceptor provides a rich framework for information-theoretic study of neural systems.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127541359","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Automatic recognition of the blood cells of myelogenous leukemia using SVM","authors":"T. Markiewicz, S. Osowski, B. Marianska, L. Moszczynski","doi":"10.1109/IJCNN.2005.1556295","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556295","url":null,"abstract":"The paper presents the system for automatic recognition of the leukemia blast cells on the basis of the image of the bone marrow aspirate. The recognizing system uses support vector machine (SVM) as the classifier and exploits the features of the image of the blood cells related to the texture, geometry and histograms. The results presented in the paper are concerned with the features generation and selection in order to get the best results of recognition. The results of numerical experiments of recognition of 17 classes of blood cells of myelogenous leukemia are presented and discussed.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"67 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127044379","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Artificial neural networks for temporal processing applied to prediction of electric energy in small hydroelectric power stations","authors":"P. Joaquim, J. Rosa","doi":"10.1109/IJCNN.2005.1556317","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556317","url":null,"abstract":"The purpose of this work is to present a computational prediction of temporal series through artificial neural networks (ANN) with temporal features based on short-term memory structures and episodic long-term memory. The connectionist prediction is applied to a Brazilian small hydroelectric power station, with generation capacity of 15 MWh, because conventional prediction statistical techniques show inadequacy in relation to noise, acquisition fails, and need for generalization, when applied to this model. Departing from the proposed system, it is intended also to develop, in the future, a non-linear complex system, employing ANNs, with the inclusion of new variables in the decision process, in addition to the episodic memory model, which is considered computationally feasible with the current available resources.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"2008 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127319149","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Intelligent systems for meteorological events forecast","authors":"E. Pasero, W. Moniaci, T. Meindl","doi":"10.1109/IJCNN.2005.1556348","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556348","url":null,"abstract":"In this paper a committee of \"intelligent systems\" evaluates the occurrence of meteorological phenomena. Rain and fog are the events which are considered. The forecast system is based on a multinetwork approach which evaluates data coming from electronic sensors and from satellite observations. More data and more engines are used to increase the reliability of the event prediction. The increased complexity of the global system requires more data coming from different sources but gives a good reliability.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"45 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"131026323","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}