VII Brazilian Symposium on Neural Networks (SBRN 2002), Proceedings, 11 November 2002.

A new technique based on genetic algorithms for tracking of power system harmonics
R. A. Macêdo, Donato da Silva-Filho, D. Coury, A. Carvalho
DOI: https://doi.org/10.1109/SBRN.2002.1181427

Voltage and current waveforms in a distribution or transmission system are not pure sinusoids: the waveforms contain distortions that combine the fundamental frequency, harmonics, and high-frequency transients. This work presents a harmonic identification method for distorted waveforms in electric power systems. The method is based on the genetic algorithm, an optimization technique inspired by genetics and natural evolution. The algorithm was tested on simulated data, and the effects of the initial population size, the crossover rate, and the mutation rate were studied. The results demonstrate that the method is precise when compared with the traditional Fourier transform.

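The GA-based harmonic identification described above can be sketched as follows. This is a minimal illustration, not the authors' implementation: it estimates only the amplitudes of the fundamental and the third harmonic of a synthetic 60 Hz waveform, with the population size, crossover rate, and mutation rate (the parameters the paper studies) exposed as arguments. All names, numeric choices, and the sampling setup here are assumptions.

```python
import math
import random

# Synthetic distorted waveform (hypothetical): 60 Hz fundamental plus a
# 3rd harmonic, sampled at 1920 Hz.
TRUE_A1, TRUE_A3 = 1.0, 0.3
RATE = 1920.0
SAMPLES = [TRUE_A1 * math.sin(2 * math.pi * 60 * n / RATE)
           + TRUE_A3 * math.sin(2 * math.pi * 180 * n / RATE)
           for n in range(64)]

def fitness(chrom):
    # Negative sum of squared errors between measured and rebuilt waveform.
    a1, a3 = chrom
    err = 0.0
    for n, s in enumerate(SAMPLES):
        est = (a1 * math.sin(2 * math.pi * 60 * n / RATE)
               + a3 * math.sin(2 * math.pi * 180 * n / RATE))
        err += (s - est) ** 2
    return -err

def evolve(pop_size=40, generations=80, crossover_rate=0.8, mutation_rate=0.3):
    random.seed(0)
    pop = [[random.uniform(0.0, 2.0), random.uniform(0.0, 2.0)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        nxt = [list(c) for c in pop[:2]]            # elitism: keep the two best
        while len(nxt) < pop_size:
            p1, p2 = random.sample(pop[:10], 2)     # select among the fittest
            child = list(p1)
            if random.random() < crossover_rate:    # one-point crossover
                child = [p1[0], p2[1]]
            for i in range(len(child)):             # Gaussian mutation
                if random.random() < mutation_rate:
                    child[i] += random.gauss(0.0, 0.1)
            nxt.append(child)
        pop = nxt
    return max(pop, key=fitness)

best = evolve()
```

With the elitist loop above, the best chromosome converges toward the true amplitudes; varying `pop_size`, `crossover_rate`, and `mutation_rate` reproduces the kind of sensitivity study the abstract mentions.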
A feed forward neural network with resolution properties for function approximation and modeling
P. H. F. D. Silva, E. Fernandes, A. Neto
DOI: https://doi.org/10.1109/SBRN.2002.1181435

This paper develops a novel feedforward artificial neural network paradigm. In its formulation, the hidden neurons are defined using sample activation functions with three parameters: amplitude, width, and translation. The hidden neurons are further classified as low- and high-resolution neurons, with global and local approximation properties, respectively. The gradient method is applied to obtain simple recursive relations for training the paradigm. The application results show interesting properties of the paradigm: (i) easy choice of neural network size; (ii) fast training; (iii) strong ability to perform complicated function approximation and nonlinear modeling.

Extracting driving signals from non-stationary time series
M. I. Széliga, P. F. Verdes, P. Granitto, H. Ceccatto
DOI: https://doi.org/10.1109/SBRN.2002.1181443

We propose a simple method for reconstructing slow dynamics perturbations from non-stationary time series records. The method traces the evolution of the perturbing signal by simultaneously learning the intrinsic stationary dynamics and the time dependency of the changing parameter. For this purpose, an extra input unit is added to a feedforward artificial neural network and a suitable error function is minimized during training. Testing the algorithm on synthetic data shows its efficacy and allows general criteria to be extracted for applications to real-world problems. Finally, a preliminary study of the well-known sunspot time series recovers particular features of this series, including recently reported changes in solar activity during the last century.

Automatic text categorization: case study
Renato Fernandes Corrêa, Teresa B. Ludermir
DOI: https://doi.org/10.1109/SBRN.2002.1181457

Text categorization is the process of classifying documents into one or more existing categories according to the themes or concepts present in their contents. Its most common application is document indexing in information retrieval systems (IRS). One way to make text categorization viable is to use machine-learning algorithms to automate classification, allowing it to be carried out quickly, consistently, and on a broad scale. The objective of this work is to present and compare the results of text categorization experiments using artificial neural networks of the multilayer perceptron and self-organizing map types, and traditional machine-learning algorithms used for this task: the C4.5 decision tree, PART decision rules, and the Naive Bayes classifier.

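Of the classifiers compared above, the Naive Bayes baseline is simple enough to sketch from scratch: count word frequencies per class, then score a new document by its log prior plus Laplace-smoothed log likelihoods. The tiny corpus and labels below are invented for illustration and are unrelated to the paper's dataset.

```python
import math
from collections import Counter, defaultdict

# Hypothetical two-class toy corpus (not the paper's data).
TRAIN = [
    ("the striker scored a goal in the match", "sport"),
    ("the team won the final match", "sport"),
    ("the market rallied as stocks rose", "finance"),
    ("investors sold stocks after the market fell", "finance"),
]

def train_nb(docs):
    # Per-class word counts, class document counts, and the vocabulary.
    word_counts = defaultdict(Counter)
    class_counts = Counter()
    vocab = set()
    for text, label in docs:
        words = text.split()
        class_counts[label] += 1
        word_counts[label].update(words)
        vocab.update(words)
    return word_counts, class_counts, vocab

def classify(text, model):
    word_counts, class_counts, vocab = model
    total_docs = sum(class_counts.values())
    best_label, best_score = None, -math.inf
    for label in class_counts:
        # log prior + Laplace-smoothed log likelihood of each token
        score = math.log(class_counts[label] / total_docs)
        denom = sum(word_counts[label].values()) + len(vocab)
        for w in text.split():
            score += math.log((word_counts[label][w] + 1) / denom)
        if score > best_score:
            best_score, best_label = score, label
    return best_label

model = train_nb(TRAIN)
```

A bag-of-words representation like this is also the usual input encoding for the MLP and SOM classifiers the paper compares against.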
A recurrent fuzzy neural network: learning and application
R. Ballini, F. Gomide
DOI: https://doi.org/10.1109/SBRN.2002.1181460

Summary form only given. A novel recurrent neuro-fuzzy network is proposed in this paper. More specifically, we generalize the recurrent neuro-fuzzy network structure proposed by Ballini et al. (2001), which in turn is an improvement of the feedforward structure introduced by Caminhas et al. (1999). The network is composed of two parts: a fuzzy inference system and a neural network. The fuzzy inference system contains fuzzy neurons modeled with logic operations processed via t-norms and s-norms. The neural network is composed of nonlinear elements placed in series with the preceding logical elements. The network model implicitly encodes a set of if-then rules, and its recurrent multilayer structure performs fuzzy inference. The recurrent fuzzy neural network is particularly suitable for modeling nonlinear dynamic systems and learning sequences. Network learning involves three main phases: 1) a convenient modification of the vector quantization approach granulates the input universes; 2) the network connections are set with initial, randomly chosen weights; and 3) the network weights are updated using two main paradigms, gradient descent and associative reinforcement learning. The performance of the recurrent neuro-fuzzy network is verified with an example. Computational experiments show that the learned fuzzy neural model is simpler, and that learning is faster, than its counterpart.

A symbolic approach to gene expression time series analysis
Ivan G. Costa, F. D. Carvalho, M. D. Souto
DOI: https://doi.org/10.1109/SBRN.2002.1181430

In the analysis of gene expression time series, emphasis has been given to capturing shape similarity (or dissimilarity), and a number of proximity functions have been proposed for this task. However, none of them suitably measures shape similarity on data containing multiple gene expression time series unless special data handling is performed. In this paper, a symbolic description of multiple gene expression time series, in which each variable takes a time series as its value, is proposed in conjunction with a version of a proximity measure. In this symbolic approach, the shape similarity of each time series is calculated independently and the results are aggregated at the end. Gene expression data from five distinct time series are presented to a symbolic dynamical clustering method and a self-organising map algorithm. The quality of the results is evaluated using gene annotation, allowing verification of the proposal's adequacy.

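The aggregation idea above, scoring shape similarity per experiment independently and then combining the scores, can be sketched as below. Z-scoring each series first makes the comparison scale-invariant, which is one common way to capture shape rather than magnitude; this normalization choice is an assumption for illustration, not necessarily the paper's proximity measure.

```python
def zscore(series):
    # Normalize a series to zero mean and unit variance so that only its
    # shape, not its scale, contributes to the distance.
    m = sum(series) / len(series)
    sd = (sum((x - m) ** 2 for x in series) / len(series)) ** 0.5
    if sd == 0:
        return [0.0] * len(series)
    return [(x - m) / sd for x in series]

def aggregated_distance(gene_a, gene_b):
    # gene_a, gene_b: lists of time series, one series per experiment.
    # Each experiment is compared independently; results are averaged.
    total = 0.0
    for sa, sb in zip(gene_a, gene_b):
        za, zb = zscore(sa), zscore(sb)
        total += sum((x - y) ** 2 for x, y in zip(za, zb)) ** 0.5
    return total / len(gene_a)
```

Two genes whose expression profiles have the same shape at different magnitudes then get distance near zero, while differently shaped profiles do not.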
Bayesian neural networks on the inference of distillation product quality
C. H. Barbosa, B. Melo, M. Vellasco, M. Pacheco, L. P. Vasconcellos
DOI: https://doi.org/10.1109/SBRN.2002.1181447

Controlling the distillation process in oil refineries requires evaluating product quality throughout the operation of the plant. This paper uses Bayesian neural networks, combined with several pre-processing and variable selection techniques, to develop systems for inferring the quality of distillation products at the REPAR refinery (Refinaria do Parana), operated by PETROBRAS.

Global optimization methods for designing and training neural networks
A. Yamazaki, Teresa B. Ludermir, M. D. Souto
DOI: https://doi.org/10.1109/SBRN.2002.1181455

This paper presents results of two approaches to the optimization of neural networks: one uses simulated annealing to optimize both architectures and weights, combined with backpropagation for fine tuning, while the other uses tabu search for the same purpose. Both approaches generate networks with good generalization performance (mean classification error of 1.68% for simulated annealing and 0.64% for tabu search) and low complexity (a mean of 11.15 connections out of 36 for simulated annealing and 11.62 out of 36 for tabu search) on an odor recognition task in an artificial nose.

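The simulated annealing component above searches weight space by proposing random perturbations and occasionally accepting worse solutions, with the acceptance probability shrinking as the temperature cools. A minimal sketch follows, using a toy linearly separable task (learning logical AND with a single threshold neuron) as a stand-in for the odor-recognition problem; the task, cooling schedule, and step size are all assumptions for illustration.

```python
import math
import random

# Toy stand-in task: learn logical AND with one threshold neuron.
DATA = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def cost(w):
    # Number of misclassified patterns for weights (w1, w2, bias).
    w1, w2, b = w
    return sum(1 for (x1, x2), y in DATA
               if (1 if w1 * x1 + w2 * x2 + b > 0 else 0) != y)

def anneal(steps=2000, t0=2.0, alpha=0.995):
    random.seed(1)
    w = [random.uniform(-1.0, 1.0) for _ in range(3)]
    best, best_c = list(w), cost(w)
    t = t0
    for _ in range(steps):
        cand = [wi + random.gauss(0.0, 0.3) for wi in w]  # random perturbation
        dc = cost(cand) - cost(w)
        # Accept improvements always; accept worse moves with
        # Boltzmann probability exp(-dc / t).
        if dc <= 0 or random.random() < math.exp(-dc / t):
            w = cand
        if cost(w) < best_c:
            best, best_c = list(w), cost(w)
        t *= alpha                                        # geometric cooling
    return best, best_c
```

In the paper's setting the candidate moves would also toggle connections on and off, so the same loop searches architectures and weights jointly before backpropagation fine-tunes the result.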
Fuzzy Markov predictor with first and second-order dependences
M. A. Teixeira, Gerson Zaverucha
DOI: https://doi.org/10.1109/SBRN.2002.1181439

We present two new versions of the fuzzy Markov predictor (FMP) with different dependences among the inputs: first-order and second-order. The FMP is a modification of the hidden Markov model that enables it to predict numerical values, and it can be seen as an extension of the fuzzy Bayes predictor. These hybrid systems are applied to the task of monthly electric load forecasting and successfully compared with a fuzzy system and two traditional forecasting methods: Box-Jenkins and Winters exponential smoothing.

SAR image classification using a neural classifier based on Fisher criterion
Alexsandro M. Jacob, E. M. Hemerly, D. Fernandes
DOI: https://doi.org/10.1109/SBRN.2002.1181464

A supervised neural classifier based on the Fisher criterion is implemented to classify two regions in a real speckled SAR image. Regions around pre-classified pixels are presented to train the neural network, which learns a sub-optimal set of masks via the back-propagation algorithm. Classification performance is evaluated against the ground truth, with more than 90% correct classification obtained. The results are also compared, via the Kappa coefficient, with a statistical classifier based on the Kullback-Leibler distance.

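The Fisher criterion underlying such a classifier measures class separability as the squared distance between class means divided by the sum of class variances; training drives features toward directions that maximize it. A one-dimensional sketch with made-up pixel intensity values (standing in for the two SAR regions) is shown below; the data are purely illustrative.

```python
# Hypothetical 1-D "pixel intensity" samples for two image regions.
region_a = [1.0, 1.2, 0.8, 1.1, 0.9]
region_b = [3.0, 3.2, 2.8, 3.1, 2.9]

def fisher_criterion(a, b):
    # J = (mean_a - mean_b)^2 / (var_a + var_b): large J means the two
    # classes are far apart relative to their spread.
    ma = sum(a) / len(a)
    mb = sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / len(a)
    vb = sum((x - mb) ** 2 for x in b) / len(b)
    return (ma - mb) ** 2 / (va + vb)
```

Here the means differ by 2.0 while each class variance is 0.02, so the criterion is large and the regions are easily separable; on speckled SAR data the variances grow and the criterion drops, which is exactly what the learned masks compensate for.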