{"title":"A Hybrid Algorithm for Solving the Economic Dispatch Problem","authors":"Raul Silva Barros, O. Cortes, R. Lopes, Josenildo Costa da Silva","doi":"10.1109/BRICS-CCI-CBIC.2013.108","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.108","url":null,"abstract":"The purpose of this work is to apply a hybrid algorithm based on Particle Swarm Optimization (PSO) and Genetic Algorithms (GA) for solving the problem of Economic Dispatch, which is based on supplying an energy demand, subjected to some restriction and reach out the best possible cost. Basically, we use the mutation operator from GAs aiming to explore regions in the search space that cannot be reached out by the canonical version of PSO. The new algorithm shows good results when applied to solve the cases based on 3, 13 and 20 generators, respectively. Our results are compared against the canonical PSO and other ones available in the literature.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127312802","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visual Odometry and Moving Objects Localization Using ORB and RANSAC in Aerial Images Acquired by Unmanned Aerial Vehicles","authors":"R. A. Reboucas, Quenaz da Cruz Eller, Mateus Habermann, Elcio Hideiti Shiguemori","doi":"10.1109/BRICS-CCI-CBIC.2013.79","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.79","url":null,"abstract":"In this paper the visual odometry and the localization of moving objects from aerial images are addressed. The techniques used in this work are the Oriented FAST and Rotated BRIEF (ORB) descriptor to detect and extract the interest points and the Random Sample Consensus (RANSAC) method to estimate the parameters from a matched points matrix for finding the camera translation. The visual odometry and morphological operations to point out moving objects have been performed.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"153 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115578649","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A New Algorithm Based on Differential Evolution for Combinatorial Optimization","authors":"André L. Maravilha, J. A. Ramírez, F. Campelo","doi":"10.1109/BRICS-CCI-CBIC.2013.21","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.21","url":null,"abstract":"Differential evolution (DE) was originally designed to solve continuous optimization problems, but recent works have been investigating this algorithm for tackling combinatorial optimization (CO), particularly in permutation-based combinatorial problems. However, most DE approaches for combinatorial optimization are not general approaches to CO, being exclusive for per mutational problems and often failing to retain the good features of the original continuous DE. In this work we introduce a new DE-based technique for combinatorial optimization to addresses these issues. The proposed method employs operations on sets instead of the classical arithmetic operations, with the DE generating smaller sub problems to be solved. This new approach can be applied to general CO problems, not only permutation-based ones. We present results on instances of the traveling salesman problem to illustrate the adequacy of the proposed algorithm, and compare it with existing approaches.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"114622599","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Spatial Cognition Degree of Development Classification Using Artificial Neural Networks and Largest Lyapunov Exponents","authors":"G. Maron, D. Barone, E. A. Ramos","doi":"10.1109/BRICS-CCI-CBIC.2013.88","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.88","url":null,"abstract":"Thirty-Seven undergraduate students (23 engineering students, 14 social and human science students) had their electroencephalogram (EEG) recorded during the performing of mental rotation and recognition of virtual tridimensional geometric patterns tasks. Their spatial cognition degree of development was assessed by a BPR-5 psychological test. The Largest Lyapunov Exponent (LLE) was calculated from each of the 8 EEG channels recorded: FP1, FP2, F3, F4, T3, T4, P3, and P4. The LLEs were used as inputs for 3 different artificial neural networks topologies: i) multilayer perceptron, ii) radial base function, and iii) voted perceptron. Then the best results obtained using each topology is compared with the results obtained using the other topologies.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"60 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127766377","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards the Use of Metaheuristics for Optimizing the Combination of Classifier and Cluster Ensembles","authors":"L. F. Coletta, Eduardo R. Hruschka, A. Acharya, Joydeep Ghosh","doi":"10.1109/BRICS-CCI-CBIC.2013.86","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.86","url":null,"abstract":"Unsupervised models can provide supplementary soft constraints to help classify new data since similar instances are more likely to share the same class label. In this context, we investigate how to make an existing algorithm, named C3E (from Combining Classifier and Cluster Ensembles), more user-friendly by automatically tunning its main parameters with the use of metaheuristics. In particular, the C3E algorithm is based on a general optimization framework that takes as input class membership estimates from existing classifiers, as well as a similarity matrix from a cluster ensemble operating solely on the new (target) data to be classified, and yields a consensus labeling of the new data. To do so, two parameters have to be defined a priori, namely: the relative importance of classifier and cluster ensembles and the number of iterations of the algorithm. In some practical applications, these parameters can be optimized via (time consuming) grid search approaches based on cross-validation procedures. This paper shows that metaheuristics can be more computationally efficient alternatives for optimizing such parameters. More precisely, analyses of statistical significance made from experiments performed on fourteen datasets show that five metaheuristics can yield classifiers as accurate as those obtained from grid search, but taking half the running time.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"84 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128167889","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A WiSARD-Based Approach to CDnet","authors":"Massimo De Gregorio, Maurizio Giordano","doi":"10.1109/BRICS-CCI-CBIC.2013.37","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.37","url":null,"abstract":"In this paper, we present a WiSARD-based system (CwisarD) facing the problem of change detection (CD) in multiple images of the same scene taken at different time, and, in particular, motion in videos of the same view taken by a static camera. Although the proposed weightless neural approach is very simple and straightforward, it provides very good results in challenging with others approaches on the ChangeDetection.net benchmark dataset (CDnet).","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"14 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134207118","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparing MOPSO Approaches for Hydrothermal Systems Operation Planning","authors":"Jonathan Cardoso Silva, G. Cruz, C. Vinhal, D. R. C. Silva, C. Bastos-Filho","doi":"10.1109/BRICS-CCI-CBIC.2013.97","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.97","url":null,"abstract":"Hydrothermal operational planning is categorized as an optimization problem that demands operational strategies of hydroelectric power plants in order to minimize the use of thermoelectric power plants, while maintaining the highest possible level of system's reservoirs during planning period. Moreover, the problem must meet a set of complex constraints. We showed in this paper that it is possible to tackle the medium-term planning of hydrothermal systems as a multi-objective problem. The particles were represented as vectors indicating the monthly generation of hydropower. We applied some three recent swarm based multi-objective optimizers, MOPSO-CDR, MOPSO-DFR and SMPSO. This trade-off is presented in Pareto Fronts, which can be used for decision making. Among the assessed approaches involving a system composed of eight Brazilian hydroelectric plants, we observed that the MOPSO-CDR returned the best results and it is worth to include seeds from mono-objective approaches to improve the convergence capacity. We included the result achieved by the PSO-CLANM algorithm and it generated effective results.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134314292","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"High Level Classification Totally Based on Complex Networks","authors":"M. Carneiro, Liang Zhao","doi":"10.1109/BRICS-CCI-CBIC.2013.90","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.90","url":null,"abstract":"Differently from traditional machine learning techniques applied to data classification, high level classification considers not only the physical features of the data (distance, similarity or distribution), but also the pattern formation of the data. In this latter case, a set of complex network measures are employed because of their abilities to capture spatial, functional and topological relations. Although high level techniques offer powerful features, good classification performance is usually obtained by combining them with some low level algorithms, which, in turn, reduces the efficiency of the overall technique. A priori, the reason is that low level and high level techniques provide different visions of classification. In this way, one cannot simply substitute another. This paper presents a data classification technique in which low level and high level classifications are embedded in a unique scheme, i.e., the proposed technique does not need a separated low level technique. The novelty is the use of a simple and recently proposed complex network measure, named component efficiency. Thus, our algorithm computes the efficiency of information exchanging among vertices in each component and the resulting values are used to drive the classification of the new instances i.e., a new instance will be classified into one of the components (class), in which their local features are in conformity with the insertion of the new instance. The experiments performed with artificial and real-world data sets show our approach totally based on complex networks is promising and it provides better results than some traditional classification techniques.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"35 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115803034","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Combined Active and Semi-supervised Learning Using Particle Walking Temporal Dynamics","authors":"Fabricio A. Breve","doi":"10.1109/BRICS-CCI-CBIC.2013.14","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.14","url":null,"abstract":"Both Semi-Supervised Leaning and Active Learning are techniques used when unlabeled data is abundant, but the process of labeling them is expensive and/or time consuming. In this paper, those two machine learning techniques are combined into a single nature-inspired method. It features particles walking on a network built from the data set, using a unique random-greedy rule to select neighbors to visit. The particles, which have both competitive and cooperative behavior, are created on the network as the result of label queries. They may be created as the algorithm executes and only nodes affected by the new particles have to be updated. Therefore, it saves execution time compared to traditional active learning frameworks, in which the learning algorithm has to be executed several times. The data items to be queried are select based on information extracted from the nodes and particles temporal dynamics. Two different rules for queries are explored in this paper, one of them is based on querying by uncertainty approaches and the other is based on data and labeled nodes distribution. Each of them may perform better than the other according to some data sets peculiarities. Experimental results on some real-world data sets are provided, and the proposed method outperforms the semi-supervised learning method, from which it is derived, in all of them.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125747796","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Learning Finite-State Machines with Classical and Mutation-Based Ant Colony Optimization: Experimental Evaluation","authors":"D. Chivilikhin, V. Ulyantsev","doi":"10.1109/BRICS-CCI-CBIC.2013.93","DOIUrl":"https://doi.org/10.1109/BRICS-CCI-CBIC.2013.93","url":null,"abstract":"The problem of learning finite-state machines (FSM) is tackled by three Ant Colony Optimization (ACO) algorithms. The first two classical ACO algorithms are based on the classical ACO combinatorial problem reduction, where nodes of the ACO construction graph represent solution components, while full solutions are built by the ants in the process of foraging. The third recently introduced mutation-based ACO algorithm employs another problem mapping, where construction graph nodes represent complete solutions. Here, ants travel between solutions to find the optimal one. In this paper we try to take a step back from the mutation-based ACO to find out if classical ACO algorithms can be used for learning FSMs. It was shown that classical ACO algorithms are inefficient for the problem of learning FSMs in comparison to the mutation-based ACO algorithm.","PeriodicalId":306195,"journal":{"name":"2013 BRICS Congress on Computational Intelligence and 11th Brazilian Congress on Computational Intelligence","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2013-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130558752","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}