{"title":"Ability to skip steps emerging from chaotic dynamics","authors":"Luciana P. P. Bueno, A. Araujo","doi":"10.1109/IJCNN.2005.1555944","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1555944","url":null,"abstract":"A chaotic bidirectional memory model (C-BAM) is constructed through the inclusion of chaotic neurons in the original BAM. Empiric experiments showed the occurrence of a chaotic dynamic capable to generate large diversity of recalled patterns involving complex excursions over all stored memories. This suggested that the retrieval sequence can model the ability of a novice or the ability of an expert to execute a task. Moreover, the paper illustrates a case in which a novice recall can be transformed into an expert recall through parametric variation.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121429241","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"On the design of an ellipsoid ARTMAP classifier within the fuzzy adaptive system ART framework","authors":"R. Peralta, G. Anagnostopoulos, E. Gómez-Sánchez, S. Richie","doi":"10.1109/IJCNN.2005.1555876","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1555876","url":null,"abstract":"In this paper we present the design of fuzzy adaptive system ellipsoid ARTMAP (FASEAM), a novel neural architecture based on ellipsoid ARTMAP (EAM) that is equipped with concepts utilized in the fuzzy adaptive system ART (FASART) architecture. More specifically, we derive a new category choice function appropriate for EAM categories that is non-constant in a category's representation region. Additionally, we augment the EAM category description with a centroid vector, whose learning rate is inversely proportional to the number of training patterns accessing the category. Finally, we demonstrate the merits of our design choices by comparing FASART, EAM and FASEAM in terms of generalization performance and final structural complexity on a set of classification problems.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"22 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123075422","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning of an XOR problem in the presence of noise and redundancy","authors":"D. Cousineau","doi":"10.1109/IJCNN.2005.1556226","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556226","url":null,"abstract":"Recently introduced time-based networks represent an alternative to the usual strength-based networks. In this paper, we compare two instances of each family of networks that are of comparable complexity, the perceptron and the race network when faced with uncertain input. Uncertainty was manipulated in two different ways, within channel by adding noise and between channels by adding redundant inputs. For the perceptron, results indicate that if noise is high, redundancy must be low (or vice versa), otherwise learning does not occur. For the race network, the opposite is true: if both noise and redundancy increase, learning remains both fast and reliable. Asymptotic statistic theories suggest that these results may be true of all the networks belonging to these two families. Thus, redundancy is a non trivial factor","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123104427","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fast pattern detection using neural networks and cross correlation in the frequency domain","authors":"H. El-Bakry, Qiangfu Zhao","doi":"10.1109/IJCNN.2005.1556170","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556170","url":null,"abstract":"Recently, fast neural networks for object/face detection were presented in S. Ben-acoub et al. The speed up factor of these networks relies on performing cross correlation in the frequency domain between the input image and the weights of the hidden layer. But, these equations given in for conventional and fast neural networks are not valid for many reasons presented here. In this paper, correct equations for cross correlation in the spatial and frequency domains are presented. Furthermore, correct formulas for the number of computation steps required by conventional and fast neural networks given are introduced. A new formula for the speed up ratio is established. Also, corrections for the equations of fast multi scale object/face detection are given. Moreover, commutative cross correlation is achieved. Simulation results show that sub-image detection based on cross correlation in the frequency domain is faster than classical neural networks.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123110539","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Blind inversion of Wiener system for single source using nonlinear blind source separation","authors":"Zhang Li-sun, De-shuang Huang, Chunhou Zheng, L. Shang","doi":"10.1109/IJCNN.2005.1556030","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556030","url":null,"abstract":"In this paper, a nonlinear blind source separation system with post-nonlinear mixing; model, and an unsupervised learning algorithm for the parameters of this separating system are presented for blind inversion of Wiener system for single source. The proposed method firstly changes the deconvolution part of Wiener system into a special case of linear blind source separation (BSS). Then the nonlinear BSS system is applied to derive the source signal. The proposed nonlinear BSS method can dynamically estimate the nonlinearity of mixing model and adapt to the cumulative probability function (CPF) of sources. Finally, experimental results demonstrate that our proposed method is effective and efficient for the problems addressed.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"7 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114084226","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixture of heterogeneous experts applied to time series: a comparative study","authors":"W. J. Puma-Villanueva, C. Lima, E.P. dos Santos, F. von Zuben","doi":"10.1109/IJCNN.2005.1556017","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556017","url":null,"abstract":"Prediction models for time series generally include preprocessing followed by the synthesis of an input-output mapping. Neural network models have been adopted to perform both steps, by means of unsupervised and supervised learning, respectively. The flexibility and the generalization capability are the most relevant attributes in favor of connectionist approaches. However, even though time series prediction can be roughly interpreted as learning from data, high levels of performance will solely be achieved if some peculiarities of each time series are properly considered in the design, particularly the existence of trend and seasonality. Instead of directly adopting detrend and/or deseasonality treatments, this paper proposes a novel paradigm for supervised learning based on a mixture of heterogeneous experts. Some mixture models have already been proved to produce good performance as predictors, but the present approach is devoted to a hybrid mixture composed of a set of distinct experts. The purpose is not only to further explore the \"divide-and-conquer\" principle, but also to compare the performance of mixture of heterogeneous experts with the standard mixture of experts approach, using ten distinct time series. The obtained results indicate that mixture of heterogeneous experts generally requires a more elaborate gating device and performs better in the case of more challenging time series.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"8 5","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121000387","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"fMRI experiments and computational models of the function of the prefrontal cortex and the basal ganglia: a review","authors":"O. Monchi","doi":"10.1109/IJCNN.2005.1556116","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556116","url":null,"abstract":"We have previously developed computational neuroscience models of fronto-striatal activity during the performance of the Wisconsin card sorting task (WCST), a well-known set- shifting task (Monchi et al., 1999 and 2000). The simulation of this model helped design a novel event-related functional magnetic resonance imaging fMRI protocol that allows for the separation of four temporal stages of the task, that was used in both healthy controls and patients with Parkinson's disease (PD) (Monchi et al., 2001 and 2004). Here, the advantages and limitations of our previous computational methods were discussed with respect to functional neuroimaging data acquisition, and examples of new, on-going studies using both fMRI and computational neuroscience were given.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"24 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116112649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Fast Bayesian support vector machine parameter tuning with the Nystrom method","authors":"C. Gold, Peter Sollich","doi":"10.1109/IJCNN.2005.1556372","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556372","url":null,"abstract":"We experiment with speeding up a Bayesian method for tuning the hyperparameters of a support vector machine (SVM) classifier. The Bayesian approach gives the gradients of the evidence as averages over the posterior, which can be approximated using hybrid Monte Carlo simulation (HMC). By using the Nystrom approximation to the SVM kernel, our method significantly reduces the dimensionality of the space to be simulated in the HMC. We show that this speeds up the running time of the HMC simulation from O(n/sup 2/) (with a large prefactor) to effectively O(n), where n is the number of training samples. We conclude that the Nystrom approximation has an almost insignificant effect on the performance of the algorithm when compared to the full Bayesian method, and gives excellent performance in comparison with other approaches to hyperparameter tuning.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"26 3","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"120996600","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning cycles brings chaos in Hopfield networks","authors":"C. Molter, U. Salihoglu, H. Bersini","doi":"10.1109/IJCNN.2005.1555949","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1555949","url":null,"abstract":"This paper aims at studying the impact of an Hebbian learning algorithm on the recurrent neural network's underlying dynamics. Two different kinds of learning are compared in order to encode information in the attractors of the Hopfield neural net: the storing of static patterns and the storing of cyclic patterns. We show that if the storing of static patterns leads to a reduction of the potential dynamics following the learning phase, the learning of cyclic patterns tends to increase the dimension of the potential attractors instead. In fact, such learning may be used as an extra \"route to chaos\": the more cycles to be learned, the more the network shows as spontaneous dynamics a form of chaotic itinerancy among brief oscillatory periods. These results are in line with the observations made by Freeman in the olfactory bulb of the rabbit: cycles are used to store information and the chaotic dynamics appears as the background regime composed of those cyclic \"memory bags\". It confirms precedent papers in which it was observed that huge encoding capacity in term of cyclic attractors implies strong presence of chaos.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"92 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123809302","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A multiple BAM for hetero-association and multisensory integration modelling","authors":"E. Reynaud, H. Paugam-Moisy","doi":"10.1109/IJCNN.2005.1556227","DOIUrl":"https://doi.org/10.1109/IJCNN.2005.1556227","url":null,"abstract":"We present in this article a dynamic neural network that works as a memory for multiple associations. Heterogeneous pairs of patterns can be tied together through learning within this memory, and recalled easily. Starting from Kosko's bidirectional associative memory, we modify some fundamental features of the network (topology and learning algorithm). We show empirically that this network has a high storage capacity and is only weakly dependent upon learning hyperparameters. We demonstrate its robustness to corrupted or missing data. We finally present results from experiments where this network is used as a multisensory associative memory.","PeriodicalId":365690,"journal":{"name":"Proceedings. 2005 IEEE International Joint Conference on Neural Networks, 2005.","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2005-12-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125164331","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}