{"title":"A study of cross-validation and bootstrap as objective functions for genetic algorithms","authors":"E. D. Lacerda, A. Carvalho, Teresa B Ludermir","doi":"10.1109/SBRN.2002.1181451","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181451","url":null,"abstract":"This article addresses the problem of finding the adjustable parameters of a learning algorithm using genetic algorithms. This problem is also known as the model selection problem. Some model selection techniques (e.g., cross-validation and bootstrap) are combined with the genetic algorithms of different ways. Those combinations explore features of the genetic algorithms such as the ability for handling multiple and noise objective functions. The proposed multiobjective GA is quite general and can be applied to a large range of learning algorithms.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"21 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117242080","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Application of feedforward neural networks for soft sensors in the sugar industry","authors":"D. Devogelaere, M. Rijckaert, Osvaldo Goza Leon, G. Lemus","doi":"10.1109/SBRN.2002.1181426","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181426","url":null,"abstract":"Neural networks have been successfully applied as intelligent sensors for process modeling and control. In this paper, the application of soft sensors in the cane sugar industry is discussed. A neural network is trained on historical data to predict process quality variables so that it can replace the lab-test procedure. An immediate benefit of building intelligent sensors is that the neural network can predict product quality in a timely manner.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"63 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129537607","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Decisor implementation in neural model selection by multiobjective optimization","authors":"R. A. Teixeira, A. P. Braga, R. Takahashi, R. R. Saldanha","doi":"10.1109/SBRN.2002.1181480","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181480","url":null,"abstract":"This work presents a new learning scheme for improving the generalization of multilayer perceptrons (MLPs). The proposed multiobjective algorithm approach minimizes both the sum of squared error and the norm of network weight vectors to obtain the Pareto-optimal solutions. Since the Pareto-optimal solutions are not unique, we need a decision phase (\"decisor\") in order to choose the best one as a final solution by using a validation set. The final solution is expected to balance network variance and bias and, as a result, generates a solution with high generalization capacity, avoiding over and under fitting.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124228308","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Identification of failed (fissured) fuel rods in nuclear reactors using neural processing and principal component analysis","authors":"C. B. Teles, J. Seixas","doi":"10.1109/SBRN.2002.1181477","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181477","url":null,"abstract":"A possible way to detect failed (fissured) rods, within a nuclear fuel assembly, is sounding the rods with ultrasonic pulses and examining the received echo waveforms. The detection is performed by a multilayer feedforward neural classifier, trained according to the backpropagation algorithm. The classifier achieved a detection efficiency of 93% (for failed rods) with 3% as false-alarm probability. Data compaction through principal component analysis reduced the network's input vector to 1.5% of its original length, with no efficiency loss.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129451514","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixture of experts applied to nonlinear dynamic systems identification: a comparative study","authors":"C. Lima, André L. V. Coelho, F. V. Zuben","doi":"10.1109/SBRN.2002.1181463","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181463","url":null,"abstract":"A mixture of experts (ME) model provides a modular approach wherein component neural networks are made specialists on subparts of a problem. In this framework, that follows the \"divide-and-conquer\" philosophy, a gating network learns how to softly partition the input space into regions to be each properly modeled by one or more expert networks. In this paper, we investigate the application of different ME variants to some multivariate nonlinear dynamic systems identification problems which are known to be difficult to be dealt with. The aim is to provide a comparative performance analysis between variable settings of the standard, gated, and localized ME models with more conventional NN models.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"26 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128574550","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"NeuroInflow: the new model to forecast average monthly inflow","authors":"M. Valença, Teresa B Ludermir","doi":"10.1109/SBRN.2002.1181438","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181438","url":null,"abstract":"In utilities using a mixture of hydroelectric and nonhydroelectric power, the economics of the hydroelectric plants depend upon the reservoir height and the inflow into the reservoir for several months into the future. Accurate forecasts of reservoir inflow allow the utility to feed proper amounts of fuel to individual plants, and to economically allocate the load between various nonhydroelectric plants. For this reasons, several companies in the Brazilian Electrical Sector use the linear time series models such as PARMA (periodic auto regressive moving average) models. This paper provides for river flow prediction a numerical comparison between nonlinear sigmoidal regression blocks networks (NSRBN), called NeuroInflow and PARMA models. The model was implemented to forecast monthly average inflow with a long-term prediction horizon (one to twelve months ahead). It was tested on 37 hydroelectric plants located in different river basins in Brazil. The results obtained in the evaluation of the performance of NeuroInflow were better than the results obtained with PARMA models.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"46 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129664833","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Adjacency matrix representation in evolutionary circuit synthesis","authors":"A. Mesquita","doi":"10.1109/SBRN.2002.1181469","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181469","url":null,"abstract":"Summary form only given. An analog circuit synthesis method based on evolutionary circuit techniques is proposed where the chromosome in the genetic algorithm is coded through the use of adjacency matrices. The method assumes that the circuit is described at the behavioral level in order to reduce the simulation time. It is shown that adjacency matrices coding of the chromosome reduces considerably the number of anomalous circuits generated by the genetic algorithm operations. Moreover, the proposed representation enables one to easily implement selection criteria to control the topological properties of the circuit.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"122365602","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Speeding up autonomous learning by using state-independent option policies and termination improvement","authors":"Letícia Maria Friske, C. Ribeiro","doi":"10.1109/SBRN.2002.1181488","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181488","url":null,"abstract":"In reinforcement learning applications such as autonomous robot navigation, the use of options (macro-operators) instead of low level actions has been reported to produce learning speedup due to a more aggressive exploration of the state space. In this paper we present an evaluation of the use of option policies O/sub S/. Each option policy in this framework is a fixed sequence of actions, depending exclusively on the state in which the option is initiated. This contrasts with option policies O/sub /spl Pi//, more common in the literature and that correspond to action sequences that depend on the states visited during the execution of the options. One of our goals was to analyse the effects of a variation of the action sequence length for O/sub S/ policies. The main contribution of the paper, however, is a study on the use of a termination improvement (TI) technique which allows for the abortion of option execution if a more promising one is found. Experimental results show that TI for O/sub S/ options, whose benefits had already been reported for O/sub /spl Pi// options, can be much more effective - due to its adaptation of the size of the action sequence depending on the state where the option is initiated - than indiscriminately augmenting the option size in order to increase exploration of the state space.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116711461","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Using MLP networks to classify red wines and water readings of an electronic tongue","authors":"H. C. D. Sousa, A. Riul","doi":"10.1109/SBRN.2002.1181428","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181428","url":null,"abstract":"Feasible efforts have been made to mimic the human gustatory system through an \"artificial tongue\". This device comprises an array of sensing units that is able to differentiate tastes with a higher sensitivity than the biological system. Experimental results indicate that when the data generated by such sensing units are handled by artificial neural networks, this \"artificial tongue\" can successfully discriminate wines of different winemakers, vintage and grapes, as well as different brands of mineral water, distilled water and Milli-Q water. The accuracy achieved by the experiments suggests that the sensing units may be used to detect abnormal chemical substances in a production line or even set a new approach to control quality standards in food industry.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"6 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121677506","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"NSL: a neuro-symbolic language for monotonic and non-monotonic logical inferences","authors":"E. Burattini, M. D. Gregorio, Antonio de Francesco","doi":"10.1109/SBRN.2002.1181487","DOIUrl":"https://doi.org/10.1109/SBRN.2002.1181487","url":null,"abstract":"The complete definition of a Neuro-Symbolic Language (NSL), partially introduced by Burattini et al. (2000), for monotonic and non-monotonic logical inference by means of artificial neural networks (ANNs) is presented. Both the language and its compiler have been designed and implemented. It has been shown that the ANN model here adopted (neural forward chaining) is a massively parallel abstract interpreter of definite logic programs; moreover, inhibition is used to implement a neural form of logical negation. Previous compilers for translating the neural representation of a given problem into a VHDL software, which in turn can set electronic device like FPGA, has been modified to fit the new and more complete features of the language.","PeriodicalId":157186,"journal":{"name":"VII Brazilian Symposium on Neural Networks, 2002. SBRN 2002. Proceedings.","volume":"79 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2002-11-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115236556","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}