Proceedings. Vol.1. Sixth Brazilian Symposium on Neural Networks: Latest Articles

Atmospheric pressure applied to a neural network based short term load forecasting
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889752
A. P. Soares
Abstract: The electric load is strongly related to meteorological conditions, and forecast models depend on climatic studies. This work studies the influence of atmospheric pressure on load forecasting, aiming to reduce the number of data acquisition sites and the cost of assembling, operating and maintaining the meteorological telemetry network. An experiment was carried out using a time series of the load alone, load with temperature, load with pressure and, finally, load with temperature and pressure. All systems were based on artificial neural networks (multilayer perceptrons trained by the backpropagation algorithm).
Citations: 0
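The four input layouts compared in the abstract (load alone, load with temperature, load with pressure, and load with both) can be sketched as a supervised dataset builder; the 24-hour lag window and the function name are illustrative assumptions, not details from the paper:

```python
import numpy as np

def make_dataset(load, temp=None, pres=None, lags=24):
    """Build one-step-ahead forecasting pairs: the last `lags` load
    values, optionally augmented with the current temperature and/or
    pressure reading (hypothetical feature layout)."""
    X, y = [], []
    for t in range(lags, len(load)):
        feats = list(load[t - lags:t])
        if temp is not None:
            feats.append(temp[t])
        if pres is not None:
            feats.append(pres[t])
        X.append(feats)
        y.append(load[t])
    return np.array(X), np.array(y)
```

Each variant of the experiment then corresponds to which optional series are passed in, with the same MLP trained on each resulting matrix.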
SVM-KM: speeding SVMs learning with a priori cluster selection and k-means
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889732
M. B. D. Almeida, A. Braga, J. P. Braga
Abstract: The main objective of this work is SVM-KM, a procedure based on k-means clustering that accelerates the training of support vector machines. During the SVM optimization phase, training vectors near the separation margins are likely to become support vectors and must be preserved; conversely, training vectors far from the margins are generally not taken into account in the SVM design process. SVM-KM groups the training vectors into many clusters. Clusters whose vectors all share the same class label can be disregarded, with only their centers retained, whereas clusters containing more than one class label are kept unchanged and all training vectors belonging to them are considered. Clusters of mixed composition tend to occur near the separation margins and may hold support vectors. Consequently, the number of vectors used in SVM training is smaller and the training time can be decreased without compromising the generalization capability of the SVM.
Citations: 135
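The reduction step described in the abstract can be sketched in a few lines; the plain k-means implementation and the cluster count are illustrative choices, not parameters fixed by the paper:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Minimal k-means: random data points as initial centers."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
        for j in range(k):
            if np.any(labels == j):
                centers[j] = X[labels == j].mean(axis=0)
    labels = np.argmin(((X[:, None] - centers[None]) ** 2).sum(-1), axis=1)
    return centers, labels

def svm_km_reduce(X, y, k):
    """SVM-KM-style training-set reduction: pure clusters (one class)
    collapse to their center; mixed clusters, likely near the margin,
    keep every training vector."""
    centers, labels = kmeans(X, k)
    Xr, yr = [], []
    for j in range(k):
        mask = labels == j
        if not mask.any():
            continue
        classes = np.unique(y[mask])
        if len(classes) == 1:       # pure cluster: keep only the center
            Xr.append(centers[j])
            yr.append(classes[0])
        else:                       # mixed cluster: keep all vectors
            Xr.extend(X[mask])
            yr.extend(y[mask])
    return np.array(Xr), np.array(yr)
```

The reduced set `(Xr, yr)` would then be handed to any standard SVM trainer in place of the full data.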
Non-linear modelling and chaotic neural networks
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889706
A. J. Jones, Steve Margetts, P. Durrant, Alban P. M. Tsui
Abstract: Proposes a simple methodology to construct an iterative neural network which mimics a given chaotic time series. The methodology uses the Gamma test to identify a suitable (possibly irregular) embedding of the chaotic time series, from which a one-step predictive model may be constructed. This model is then iterated to produce a close approximation to the original chaotic dynamics. Having constructed such networks, we show how the chaotic dynamics may be stabilised using time-delayed feedback, which is a plausible method for stabilisation in biological neural systems. Using delayed feedback control, activated in the presence of a stimulus, such networks can behave as an associative memory in which the act of recognition corresponds to stabilisation onto an unstable periodic orbit. We briefly illustrate how two identical, dynamically independent copies of such a chaotic iterative network may be synchronised using the delayed feedback method. Although less biologically plausible, these techniques may have interesting applications in secure communications.
Citations: 1
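The time-delayed feedback idea can be illustrated on a simple chaotic map rather than on the paper's trained networks: a logistic map in its chaotic regime is stabilised onto its unstable fixed point by feeding back the difference between the delayed and current state. The map, gain and delay below are illustrative choices, not values taken from the paper:

```python
def delayed_feedback_orbit(r=3.9, gain=-0.7, x0=0.75, steps=300):
    """Iterate x_{n+1} = r*x_n*(1 - x_n) + gain*(x_{n-1} - x_n).
    For r = 3.9 the uncontrolled map is chaotic.  The feedback term
    vanishes on any fixed point, yet for this gain it makes the
    unstable fixed point x* = 1 - 1/r locally attracting."""
    xprev, x = x0, x0
    orbit = []
    for _ in range(steps):
        xnext = r * x * (1 - x) + gain * (xprev - x)
        xprev, x = x, xnext
        orbit.append(x)
    return orbit
```

Linearising around x* gives the error recursion e_{n+1} = (r(1-2x*) - gain) e_n + gain e_{n-1}; for r = 3.9 and gain = -0.7 both roots of the characteristic polynomial lie inside the unit circle, which is why the orbit settles onto the previously unstable point.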
Extraction of statistically dependent sources with temporal structure
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889714
A. Barros, A. Cichocki, N. Ohnishi
Abstract: In this work we develop a very simple batch learning algorithm for semi-blind extraction of a desired source signal with temporal structure from linear mixtures. Although we use the concepts of sequential blind source extraction and independent component analysis (ICA), we neither carry out the extraction in a completely blind manner nor assume that the sources are statistically independent. In fact, we show that a priori information about the autocorrelation function of the primary sources can be used to extract the desired signals (sources of interest) from their mixtures. Extensive computer simulations and experiments with real data confirm the validity and high performance of the proposed algorithm.
Citations: 1
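The use of a known autocorrelation lag as prior information can be sketched as a power iteration on the delayed correlation of prewhitened mixtures. This is a generic stand-in in the spirit of the abstract, not the authors' exact batch update:

```python
import numpy as np

def extract_with_lag(x, tau, iters=50, seed=0):
    """Extract the one source whose autocorrelation is strongest at
    lag `tau` from linear mixtures x (rows = sensors)."""
    # prewhiten so the mixture covariance becomes the identity
    x = x - x.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(x))
    z = (E / np.sqrt(d)).T @ x
    rng = np.random.default_rng(seed)
    w = rng.normal(size=z.shape[0])
    for _ in range(iters):
        y = w @ z
        # fixed-point step: w <- E[z(t) y(t - tau)], then renormalise
        w = z[:, tau:] @ y[:-tau] / (len(y) - tau)
        w /= np.linalg.norm(w)
    return w @ z
```

Because whitening reduces the mixing to a rotation, the iteration converges to the direction whose source dominates the lag-`tau` correlation, which is exactly the prior the abstract exploits.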
Rule extraction from linear combinations of DIMLP neural networks
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889720
G. Bologna
Abstract: The problem of rule extraction from neural networks is NP-hard. This work presents a new technique to extract If-Then-Else rules from linear combinations of discretised interpretable multilayer perceptron (DIMLP) neural networks. Rules are extracted in polynomial time with respect to the dimensionality of the problem, the number of examples, and the size of the resulting network. Further, the degree of matching between extracted rules and neural network responses is 100%. Linear combinations of DIMLP networks were trained on four public-domain data sets. On average, the extracted rules are more accurate than those extracted from C4.5 decision trees.
Citations: 6
Continuous attractors in recurrent neural networks and phase space learning
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889763
Rogério de Oliveira, L. Monteiro
Abstract: Recurrent networks can be used as associative memories in which the stored memories are fixed points to which the dynamics of the network converges. These networks, however, can also present continuous attractors, such as limit cycles and chaotic attractors. We argue for the use of these attractors in recurrent networks for the construction of associative memories, provide a training algorithm for continuous attractors, and present some numerical results of the learning method, which involves genetic algorithms.
Citations: 0
A neural propositional reasoner that is goal-driven and works without pre-compiled knowledge
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889749
P. Lima
Abstract: This work presents the propositional version of a neural engine for finding proofs by refutation using the resolution principle. The neural architecture does not require special arrangements or different modules to perform forward or backward reasoning, being driven by the goal posed to it. The engine is also capable of performing monotonic reasoning with both complete and incomplete knowledge in an integrated fashion; to do so, it was necessary to provide the system with the ability to create new sentences (clauses). The neural mechanism presented herein is, to our knowledge, the first that does not require the clauses of the knowledge base to be either pre-encoded as constraints or learnt from examples, although adding these features to the system is not an impossibility.
Citations: 4
A clustering method for improving the global search capability of genetic algorithms
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889709
L. Schnitman, T. Yoneyama
Abstract: This work concerns heuristic concepts that can be used to improve the search capability and speed of convergence of genetic algorithms (GAs) in finding global solutions to function optimization problems. The main idea is to group the members of the population into clusters using a local criterion to distinguish them. Pairing of individuals belonging to distinct clusters is then promoted in order to generate descendants with improved fitness. Moreover, severely unfavorable regions are turned into exclusion zones (EZs); descendants generated close to an EZ have a reduced survival probability. The search for outlying clusters is based on a continuously adjusted mutation rate to increase the probability of finding the global minima.
Citations: 2
Selecting diverse members of neural network ensembles
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889748
H. Navone, P. F. Verdes, P. Granitto, H. Ceccatto
Abstract: Ensembles of artificial neural networks have been used as classification/regression machines, showing improved generalization capabilities that outperform those of single networks. However, it has been recognized that for aggregation to be effective the individual networks must be as accurate and diverse as possible. An important problem, then, is how to choose the aggregate members so as to strike an optimal compromise between these two conflicting conditions. We propose a new method for selecting members of regression/classification ensembles that leads to small aggregates with few but very diverse individual predictors. Using artificial neural networks as individual learners, the algorithm is favorably tested against other methods in the literature, producing a remarkable performance improvement on the standard statistical databases used as benchmarks. In addition, as a concrete application, we study the sunspot time series and predict the remainder of the current cycle 23 of solar activity.
Citations: 22
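A common way to trade off accuracy against diversity when picking aggregate members is greedy forward selection on validation predictions. The sketch below is a generic stand-in for illustration, not the authors' selection criterion:

```python
import numpy as np

def greedy_select(preds, y_val, max_members=5):
    """Greedily add the candidate whose inclusion most lowers the
    MSE of the averaged ensemble prediction on a validation set.
    `preds` is a list of per-model prediction arrays."""
    chosen = []
    best_err = np.inf
    for _ in range(max_members):
        errs = []
        for i in range(len(preds)):
            if i in chosen:
                errs.append(np.inf)
                continue
            agg = np.mean([preds[j] for j in chosen + [i]], axis=0)
            errs.append(np.mean((agg - y_val) ** 2))
        i_best = int(np.argmin(errs))
        if errs[i_best] >= best_err:
            break               # no candidate improves the aggregate
        best_err = errs[i_best]
        chosen.append(i_best)
    return chosen, best_err
```

Note that the criterion rewards diversity implicitly: two individually mediocre but complementary predictors can beat any single accurate one once averaged.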
Techniques for image compression: a comparative analysis
Pub Date: 2000-01-22 | DOI: 10.1109/SBRN.2000.889747
P. R. Oliveira, R.A.F. Romero, L. G. Nonato, J. Mazucheli
Abstract: Some techniques for image compression are investigated in this article. The first is the well-known JPEG, the most widely used technique for image compression. The second is principal component analysis (PCA), also called the Karhunen-Loeve transform, a statistical method applied to multivariate data analysis and feature extraction. For the latter, two approaches are considered: the classical statistical method and one based on artificial neural networks. In a comparative study, the results obtained by the PCA neural network for compressing medical images are analyzed together with those obtained using the classical statistical method and the JPEG compression standard.
Citations: 4
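The classical statistical variant of PCA compression mentioned in the abstract can be sketched as block-wise PCA via the SVD; the 8x8 block size mirrors JPEG's, but it and the function name are illustrative assumptions:

```python
import numpy as np

def pca_compress(img, n_components, block=8):
    """Compress a grayscale image with classical (Karhunen-Loeve) PCA:
    split into block x block patches, keep only the leading principal
    components of the patch population, and reconstruct."""
    h, w = img.shape
    h, w = h - h % block, w - w % block          # crop to whole blocks
    patches = (img[:h, :w]
               .reshape(h // block, block, w // block, block)
               .transpose(0, 2, 1, 3)
               .reshape(-1, block * block))
    mean = patches.mean(axis=0)
    Xc = patches - mean
    # principal axes = right singular vectors of the centered patches
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    V = Vt[:n_components]
    codes = Xc @ V.T                             # compressed representation
    recon = codes @ V + mean                     # reconstruction
    out = (recon.reshape(h // block, w // block, block, block)
                .transpose(0, 2, 1, 3)
                .reshape(h, w))
    return out, codes
```

Storage drops from `block*block` numbers per patch to `n_components` (plus the shared basis and mean), which is the compression the paper compares against JPEG.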