IJCNN'99. International Joint Conference on Neural Networks. Proceedings (Cat. No.99CH36339): Latest Publications

Analysis and prediction of cranberry growth with dynamical neural network models
C. H. Chen, Bichuan Shen
DOI: 10.1109/IJCNN.1999.836208
Abstract: Cranberry plants are very sensitive to weather and other conditions. In this paper, the condition of cranberry growth is analyzed through PCA (principal component analysis) of the minimum cranberry spectral match measurement data. Three neural network models are applied to one-month-ahead prediction. The simulation results show the high-performance modeling ability of these neural networks. The reliable prediction provided by the dynamic neural networks will be useful for farmers to monitor and control the cranberry growth process.
Citations: 0
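The PCA step used in the entry above can be sketched as follows; the data shape, variable names, and synthetic measurements are illustrative, not taken from the paper.

```python
import numpy as np

def pca(X, k):
    """Project the rows of X onto the top-k principal components."""
    Xc = X - X.mean(axis=0)            # center each feature
    cov = np.cov(Xc, rowvar=False)     # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)   # eigenvalues in ascending order
    order = np.argsort(vals)[::-1]     # re-sort descending by variance
    components = vecs[:, order[:k]]
    return Xc @ components

# Hypothetical weekly spectral measurements: 52 weeks x 10 bands
rng = np.random.default_rng(0)
X = rng.normal(size=(52, 10))
scores = pca(X, k=3)
print(scores.shape)  # (52, 3)
```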
On the conditions of outer-supervised feedforward neural networks for null cost learning
De-shuang Huang
DOI: 10.1109/IJCNN.1999.831061
Abstract: This paper investigates, from the viewpoint of linear algebra, the local minima of least-square-error cost functions defined at the outputs of outer-supervised feedforward neural networks (FNNs). For a specific case, we also show that spacedly collinear samples (probably output by the final hidden layer) will be easily separated with a null-cost error function even if the condition M ≥ N is not satisfied. In the light of these conclusions, we give a general method for designing a suitable network architecture to solve a specific problem.
Citations: 0
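A toy illustration of the linear-algebra viewpoint in the entry above (our notation, not the paper's): when the matrix of final-hidden-layer outputs has full row rank, least-squares output weights drive the cost exactly to zero, even with fewer samples than hidden units.

```python
import numpy as np

rng = np.random.default_rng(1)
M, N = 4, 6                    # M samples, N hidden units
H = rng.normal(size=(M, N))    # hidden-layer outputs; full row rank w.p. 1
T = rng.normal(size=(M, 1))    # target outputs

# Minimum-norm least-squares output weights; with rank(H) == M the fit is exact
W, residuals, rank, _ = np.linalg.lstsq(H, T, rcond=None)
cost = np.sum((H @ W - T) ** 2)
print(round(cost, 12))  # 0.0
```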
Design and analysis of neural networks for systems optimization
I. Silva, M. E. Bordon, A. Souza
DOI: 10.1109/IJCNN.1999.831583
Abstract: Artificial neural networks are dynamic systems consisting of highly interconnected and parallel nonlinear processing elements that are extremely effective in computation. This paper presents an architecture of artificial neural networks that can be used to solve several classes of optimization problems. More specifically, a modified Hopfield network is developed and its internal parameters are computed using the valid-subspace technique. The problems that can be treated by the proposed approach include combinatorial optimization problems and dynamic programming problems.
Citations: 2
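The dynamics underlying Hopfield-style optimization can be sketched with a generic discrete Hopfield network (not the paper's modified network or the valid-subspace technique): with symmetric weights and zero self-connections, asynchronous updates never increase the energy, so the network settles in an attractor.

```python
import numpy as np

def energy(W, s):
    """Hopfield energy E = -1/2 s^T W s."""
    return -0.5 * s @ W @ s

rng = np.random.default_rng(2)
n = 8
A = rng.normal(size=(n, n))
W = (A + A.T) / 2            # symmetric weights
np.fill_diagonal(W, 0.0)     # zero self-connections
s = rng.choice([-1.0, 1.0], size=n)

energies = [energy(W, s)]
for _ in range(50):
    i = rng.integers(n)                       # asynchronous: one unit at a time
    s[i] = 1.0 if W[i] @ s >= 0 else -1.0     # align with local field
    energies.append(energy(W, s))

# Energy is non-increasing along the trajectory
assert all(e2 <= e1 + 1e-9 for e1, e2 in zip(energies, energies[1:]))
print(energies[-1] <= energies[0])  # True
```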
Neural networks for consciousness: the central representation
John G. Taylor
DOI: 10.1109/IJCNN.1999.831462
Abstract: A framework is developed, and criteria thereby deduced, for a neural site to be regarded as essential for the creation of consciousness. Various sites in the brain are considered, but only very few are found to satisfy all of the criteria. The framework proposed here is based on the notion of the central representation, regarded as being composed of information deemed intrinsic to awareness. In particular, the central representation is suggested as being in the inferior parietal lobes. Implications of this identification are discussed.
Citations: 5
Self-trapping in an attractor neural network with nearest neighbor synapses mimics full connectivity
R. Pavloski, M. Karimi
DOI: 10.1109/IJCNN.1999.831586
Abstract: A means of providing the feedback necessary for an associative memory is suggested by self-trapping, the development of localization phenomena and order in coupled physical systems. Following the lead of Hopfield (1982, 1984), who exploited the formal analogy of a fully connected ANN to an infinite-ranged-interaction Ising model, we have carried through a similar development to demonstrate that self-trapping networks (STNs) with only near-neighbor synapses develop attractor states through localization of a self-trapping input. The attractor states of the STN are the stored memories of this system, and are analogous to the magnetization developed in a self-trapping 1D Ising system. Post-synaptic potentials for each stored memory become trapped at non-zero values and a sparsely connected network evolves to the corresponding state. Both analytic and computational studies of the STN show that this model mimics a fully connected ANN.
Citations: 3
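The general phenomenon behind the entry above, that nearest-neighbor couplings plus a persistent input can develop global order, can be illustrated with generic zero-temperature dynamics on a 1D Ising ring; this is not the authors' STN model, and the coupling and input values are invented.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 32
s = rng.choice([-1, 1], size=n)   # spins on a ring, random start
J, h = 1.0, 0.5                   # nearest-neighbor coupling, persistent input

for _ in range(5000):
    i = rng.integers(n)
    # Local field from the two ring neighbors plus the external input
    field = J * (s[i - 1] + s[(i + 1) % n]) + h
    s[i] = 1 if field > 0 else -1

magnetization = s.mean()
print(magnetization)  # approaches 1.0 as the -1 domains are eroded
```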
Generation of explicit knowledge from empirical data through pruning of trainable neural networks
Alexander N. Gorban, E. M. Mirkes, V. G. Tsaregorodtsev
DOI: 10.1109/IJCNN.1999.830876
Abstract: This paper presents a generalized technology for extraction of explicit knowledge from data. The main ideas are: 1) maximal reduction of network complexity (not only removal of neurons or synapses, but removal of all unnecessary elements and signals, and reduction of the complexity of elements); 2) use of an adjustable and flexible pruning process (the user should be able to prune the network in his own way in order to achieve a desired network structure for the purpose of extracting rules of the desired type and form); and 3) extraction of rules not in a predetermined but in any desired form. Considerations and notes about network architecture, the training process, and the applicability of currently developed pruning techniques and rule extraction algorithms are discussed. This technology, which we have been developing for more than 10 years, has allowed us to create dozens of knowledge-based expert systems.
Citations: 11
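The paper's pruning technology is much broader, but its simplest ingredient, removing the smallest-magnitude weights, can be sketched as follows (function name and sizes are ours):

```python
import numpy as np

def prune_smallest(W, fraction):
    """Zero out the given fraction of weights with smallest magnitude."""
    k = int(W.size * fraction)
    if k == 0:
        return W.copy()
    thresh = np.sort(np.abs(W), axis=None)[k - 1]   # k-th smallest magnitude
    pruned = W.copy()
    pruned[np.abs(pruned) <= thresh] = 0.0
    return pruned

rng = np.random.default_rng(3)
W = rng.normal(size=(5, 5))
Wp = prune_smallest(W, 0.4)
print(int((Wp == 0).sum()))  # 10
```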
The α-EM learning and its cookbook: from mixture-of-expert neural networks to movie random field
Y. Matsuyama, T. Ikeda, Tomoaki Tanaka, S. Furukawa, N. Takeda, Takeshi Niimoto
DOI: 10.1109/IJCNN.1999.831162
Abstract: The α-EM algorithm is a proper extension of the traditional log-EM algorithm. This new algorithm is based on the α-logarithm, while the traditional one uses the logarithm; the case of α = −1 corresponds to the log-EM algorithm. Since the speed of the α-EM algorithm has been reported for learning problems, this paper shows that closed-form E-steps can be obtained for a wide class of problems through a set of common techniques: that is, a cookbook for the α-EM algorithm is presented. The recipes include unsupervised neural networks, supervised neural networks for various gating, hidden Markov models, and Markov random fields for moving-object segmentation. Reasoning for the speedup is also given.
Citations: 2
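The α-logarithm underlying α-EM is, in Matsuyama's formulation, L_α(x) = (2/(1+α))(x^((1+α)/2) − 1), which reduces to log x as α → −1; a quick numerical check (the function name is ours):

```python
import math

def alpha_log(x, alpha):
    """Matsuyama's alpha-logarithm; alpha -> -1 recovers the natural log."""
    e = (1.0 + alpha) / 2.0
    return (x ** e - 1.0) / e   # same as (2/(1+alpha)) * (x**((1+alpha)/2) - 1)

x = 2.5
# Approach alpha -> -1 and compare with the ordinary logarithm
approx = alpha_log(x, -1.0 + 1e-8)
print(abs(approx - math.log(x)) < 1e-6)  # True
```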
Time topology for the self-organizing map
P. Somervuo
DOI: 10.1109/IJCNN.1999.832671
Abstract: Time information of the input data is used for evaluating the goodness of the self-organizing map in storing and representing temporal feature vector sequences. A new node neighborhood is defined for the map which takes the temporal order of the input samples into account. A connection is created between the two map nodes which are the best-matching units for two successive input samples in time. This results in a time-topology-preserving network.
Citations: 9
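The idea of linking the best-matching units of successive samples can be sketched on a pre-trained map; the codebook, map size, and data here are invented stand-ins.

```python
import numpy as np

rng = np.random.default_rng(4)
codebook = rng.normal(size=(16, 3))   # 16 map nodes with 3-dim prototypes
sequence = rng.normal(size=(20, 3))   # temporal feature vector sequence

def bmu(x):
    """Index of the best-matching unit (nearest prototype) for input x."""
    return int(np.argmin(np.linalg.norm(codebook - x, axis=1)))

# Create a temporal connection between the BMUs of successive samples
bmus = [bmu(x) for x in sequence]
edges = set()
for a, b in zip(bmus, bmus[1:]):
    if a != b:
        edges.add((min(a, b), max(a, b)))

print(len(edges) <= len(sequence) - 1)  # True
```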
Approximation of a function and its derivatives in feedforward neural networks
E. Basson, A. Engelbrecht
DOI: 10.1109/IJCNN.1999.831531
Abstract: A new learning algorithm is presented that learns a function and its first-order derivatives. Derivatives are learned together with the function using gradient descent. Preliminary results show that the algorithm accurately approximates the derivatives.
Citations: 15
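Learning a function together with its derivative can be sketched by adding a derivative-error term to the loss; this toy fits sin and its derivative cos with a tiny network via numerical gradient descent, and is a sketch of the general idea, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(5)
xs = np.linspace(-1.0, 1.0, 20)
f_true, df_true = np.sin(xs), np.cos(xs)   # target function and derivative

# Tiny one-hidden-layer net: y(x) = sum_j w2[j] * tanh(w1[j]*x + b1[j])
w1 = 0.5 * rng.normal(size=5)
b1 = 0.5 * rng.normal(size=5)
w2 = 0.5 * rng.normal(size=5)
params = [w1, b1, w2]

def loss(params):
    w1, b1, w2 = params
    h = np.tanh(np.outer(xs, w1) + b1)     # (20, 5) hidden activations
    y = h @ w2                             # network output
    dy = ((1.0 - h ** 2) * w1) @ w2        # exact dy/dx via the chain rule
    return np.mean((y - f_true) ** 2) + np.mean((dy - df_true) ** 2)

initial = loss(params)
lr, eps = 0.05, 1e-5
for _ in range(300):
    for arr in params:                     # central-difference gradient descent
        g = np.zeros_like(arr)
        for i in range(arr.size):
            old = arr.flat[i]
            arr.flat[i] = old + eps
            up = loss(params)
            arr.flat[i] = old - eps
            g.flat[i] = (up - loss(params)) / (2.0 * eps)
            arr.flat[i] = old
        arr -= lr * g
final = loss(params)
print(initial, "->", final)
```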
A neural network endowed with symbolic processing ability
D. Vogiatzis, A. Stafylopatis
DOI: 10.1109/IJCNN.1999.830809
Abstract: We propose a neural network method for the generation of symbolic expressions using reinforcement learning. According to the proposed method, a human decides on the kind and number of primitive functions which, with the appropriate composition (in the mathematical sense), can represent a mapping between two domains. The appropriate composition is achieved by an agent which tries many compositions and receives a reward depending on the quality of the composed function.
Citations: 0
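The composition-search idea, an agent trying compositions of user-chosen primitives and scoring each by how well the composed function fits a target mapping, can be sketched with plain random search; the primitives, target, and reward are invented, and the paper's agent is a reinforcement learner, not this brute-force loop.

```python
import math
import random

# User-chosen primitive functions (illustrative)
primitives = {"sin": math.sin, "neg": lambda x: -x, "inc": lambda x: x + 1.0}

xs = [i / 10 for i in range(-10, 11)]
target = [math.sin(x) + 1.0 for x in xs]    # unknown mapping: inc after sin

def compose(names):
    def f(x):
        for n in names:                     # apply primitives left-to-right
            x = primitives[n](x)
        return x
    return f

def reward(names):
    """Negative squared error of the composed function on the target."""
    f = compose(names)
    return -sum((f(x) - t) ** 2 for x, t in zip(xs, target))

random.seed(6)
best = max(
    (tuple(random.choices(list(primitives), k=2)) for _ in range(200)),
    key=reward,
)
print(best)  # ('sin', 'inc')
```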