[Proceedings] 1991 IEEE International Joint Conference on Neural Networks: Latest Publications

Visual inspection of soldered joints by using neural networks
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170373
Authors: S. Jagannathan, S. Balakrishnan, N. Popplewell
Abstract: The problem of solder joint inspection is viewed as a two-step process of pattern recognition and classification. A modified intelligent histogram regrading technique divides the histogram of the captured image into distinct modes; each mode is identified, and the corresponding range of grey levels is separated and regraded using neural networks. The output pattern of these networks is presented to a second stage of neural networks that selects and interprets the histogram's features. A learning mechanism based on the backpropagation algorithm identifies and classifies the defective solder joints. The proposed technique has the high speed and low computational complexity typical of nonspatial techniques.
Citations: 6

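The two-stage histogram idea above lends itself to a compact illustration. The following is a minimal sketch, assuming a grey-level image scaled to [0, 1]; the function names, thresholds, and mode-splitting heuristic are illustrative assumptions, not the authors' implementation.

```python
# Sketch: split a grey-level histogram into modes, regrade each mode into a short
# descriptor, and hand the stacked descriptors to a backpropagation classifier.
import numpy as np

def histogram_modes(image, n_bins=64, min_valley=0.002):
    """Split the grey-level histogram (image assumed scaled to [0, 1]) at valleys."""
    hist, _ = np.histogram(image.ravel(), bins=n_bins, range=(0.0, 1.0))
    hist = hist / hist.sum()
    # valleys: local minima carrying very little probability mass
    valleys = [i for i in range(1, n_bins - 1)
               if hist[i] <= hist[i - 1] and hist[i] <= hist[i + 1] and hist[i] < min_valley]
    cuts = [0] + valleys + [n_bins]
    return [hist[a:b] for a, b in zip(cuts[:-1], cuts[1:]) if b > a]

def regrade(mode, n_levels=8):
    """Regrade one mode into a fixed-length descriptor of coarsely re-binned mass."""
    idx = np.linspace(0, len(mode), n_levels + 1).astype(int)
    return np.array([mode[a:b].sum() for a, b in zip(idx[:-1], idx[1:])])

# The per-mode descriptors would then be concatenated and classified, e.g. by a
# small backpropagation network (sklearn.neural_network.MLPClassifier or similar).
```
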
Location and stability of the equilibria of nonlinear neural networks
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170664
Authors: M. Vidyasagar
Abstract: The number, location and stability behavior of the equilibria of arbitrary nonlinear neural networks are analyzed without resorting to energy arguments based on assumptions of symmetric interactions or no self-interactions. The following results are proved. Let H= […]
Citations: 1

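For context, the equilibrium question is usually posed for continuous-time additive networks of the following form; this is an assumed illustration of the setting, since the abstract is truncated and does not state the paper's exact model.

```latex
% Additive neural network dynamics (illustrative assumption, not the paper's
% stated model); no symmetry ($w_{ij}=w_{ji}$) and no zero self-interaction
% ($w_{ii}=0$) is assumed, in keeping with the abstract.
\begin{align}
  \dot{u}_i &= -u_i + \sum_{j=1}^{n} w_{ij}\,\sigma(u_j) + I_i ,
      \qquad i = 1,\dots,n, \\
  u_i^* &= \sum_{j=1}^{n} w_{ij}\,\sigma(u_j^*) + I_i
      \qquad \text{(equilibrium condition)} ,
\end{align}
% where $\sigma$ is a bounded, monotone activation. Counting such fixed points and
% testing their local stability (via the Jacobian at $u^*$) is the kind of question
% the paper addresses without energy-function arguments.
```
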
Communication network routing using neural nets-numerical aspects and alternative approaches
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170490
Authors: T. Fritsch, W. Mandel
Abstract: The authors discuss various approaches to using Hopfield networks for routing problems in computer communication networks. It is shown that the classical approach using the original Hopfield network leads to evident numerical problems and hence is not practicable. The heuristic choice of the Lagrange parameters, as presented in the literature, can result in incorrect solutions for variable dimensions, or is very time consuming when searching for correct parameter sets. A modified method using eigenvalue analysis with predetermined parameters yields recognizable improvements; on the other hand, it is not able to produce correct solutions for different topologies with higher dimensions. From a numerical viewpoint, determining the eigenvalues of the connection matrix involves severe problems, such as stiffness, and shows evident instability of the simulated differential equations. The authors present possible alternative approaches such as the self-organizing feature map and modifications of the Hopfield net, e.g. mean field annealing and the Potts glass model.
Citations: 44

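To make the numerical issues concrete, here is a minimal sketch of the kind of continuous Hopfield relaxation such routing formulations simulate. The energy terms, the penalty weights A and B (playing the role of the Lagrange parameters), and the explicit Euler integration are generic assumptions, not the formulation analysed in the paper.

```python
# Generic continuous Hopfield relaxation for a routing-style assignment problem.
import numpy as np

def hopfield_route(W_cost, A=500.0, B=1.0, steps=2000, dt=1e-4, tau=1.0):
    """Relax states v[i, j] in [0, 1]; v[i, j] ~ 'link i -> j is on the path'."""
    n = W_cost.shape[0]
    rng = np.random.default_rng(0)
    u = 0.01 * rng.standard_normal((n, n))        # internal potentials
    for _ in range(steps):
        v = 0.5 * (1.0 + np.tanh(u))              # sigmoid outputs
        # dE/dv for a generic energy: B * path-cost term + A * row/column constraints
        grad = (B * W_cost
                + A * (v.sum(axis=1, keepdims=True) - 1.0)
                + A * (v.sum(axis=0, keepdims=True) - 1.0))
        u += dt * (-u / tau - grad)               # explicit Euler step of du/dt
    return 0.5 * (1.0 + np.tanh(u))
```

The stiffness the abstract refers to shows up directly in such a simulation: large penalty weights force a very small step size for the explicit integration to remain stable, while small weights let the relaxation violate the routing constraints.
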
An implementation of short-timed speech recognition on layered neural nets
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170719
Authors: Haizhou Li, Bingzheng Xu
Abstract: The authors show a new way to handle the sequential nature of speech signals in multilayer perceptrons (MLPs) and other neural net machines. A static model in the form of state transition probability matrices, representing short speech units such as syllables that correspond to Chinese utterances of isolated characters, is adopted as the set of learning patterns for the MLPs. The network architecture and learning algorithms are described, and experimental results on speech recognition are included.
Citations: 0

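The key trick, turning a variable-length utterance into a fixed-size static pattern, can be sketched as follows; the state labels, quantiser, and matrix size are hypothetical, since the abstract does not specify them.

```python
# Illustrative sketch: convert a variable-length sequence of quantised acoustic
# states into a fixed-size transition-probability matrix, which can then be fed
# to a static MLP as a single input pattern.
import numpy as np

def transition_matrix(state_seq, n_states):
    """Row-normalised counts of transitions state_seq[t] -> state_seq[t+1]."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(state_seq[:-1], state_seq[1:]):
        counts[a, b] += 1.0
    row_sums = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row_sums, out=np.zeros_like(counts), where=row_sums > 0)

# e.g. a syllable whose frames were vector-quantised to 4 states:
pattern = transition_matrix([0, 0, 1, 2, 2, 3, 3], n_states=4).ravel()  # MLP input vector
```
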
A new learning approach to enhance the storage capacity of the Hopfield model
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170650
Authors: H. Oh, S. Kothari
Abstract: A new learning technique is introduced to address the small and restrictive storage capacity of the Hopfield model. The technique exploits the maximum storage capacity and fails only if no appropriate weights exist to store the given set of patterns. It is not based on function minimization, so there is no danger of getting stuck in local minima, and it is free from the step-size and moving-target problems. Learning is very fast, and its speed depends on the difficulty of the training patterns rather than on the parameters of the algorithm. The technique is scalable: its performance does not degrade as the problem size increases. An extensive analysis of the learning technique is provided through simulation results.
Citations: 15

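For comparison, one well-known way to push a Hopfield memory beyond the Hebb-rule capacity is a perceptron-style correction of each weight row until every pattern is a stable state. The sketch below illustrates that general idea only; it is not the algorithm proposed in the paper.

```python
# Perceptron-style weight learning for a Hopfield memory: each pattern is made a
# fixed point by correcting the rows whose local field is mis-aligned with it.
import numpy as np

def train_hopfield(patterns, lr=0.1, margin=1.0, epochs=200):
    """patterns: array of shape (P, N) with entries in {-1, +1}."""
    P, N = patterns.shape
    W = np.zeros((N, N))
    for _ in range(epochs):
        stable = True
        for x in patterns:
            h = W @ x                       # local fields
            bad = (x * h) < margin          # units whose alignment is too weak
            if bad.any():
                stable = False
                W[bad] += lr * np.outer(x[bad], x)   # correct the offending rows
        np.fill_diagonal(W, 0.0)            # no self-interactions
        if stable:
            break
    return W
```
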
Improving error tolerance of self-organizing neural nets
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170279
Authors: F. Sha, Q. Gan
Abstract: A hybrid neural net (HNN) is developed that combines the ART1 network introduced by G.A. Carpenter and S. Grossberg (1987, 1988) with the Hopfield associative memory (HAM). The HAM diminishes noise in the samples and provides them to ART1 as inputs. In order to match the capacity of the HAM with that of ART1, a new recalling algorithm (NHAM) is introduced to enlarge the capacity of the HAM. Based on NHAM and HNN, a revised version of HNN (RHNN) is introduced. The difference between RHNN and HNN is that RHNN has feedback loops, while HNN has only feedforward paths; the ART1 module in RHNN supplies information for the HAM to recall memories. Computer simulation demonstrated that RHNN has several advantages.
Citations: 0

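The HAM front end in this pipeline is essentially an associative recall that cleans a noisy pattern before classification. A minimal sketch of that recall step is given below; the paper's NHAM recalling algorithm and the ART1 stage are not reproduced.

```python
# Hopfield associative memory recall: synchronous sign updates until a fixed point.
import numpy as np

def ham_recall(W, x, max_iters=50):
    """Iterate x <- sign(W x) until it stops changing (bipolar patterns in {-1, +1})."""
    x = x.copy()
    for _ in range(max_iters):
        x_new = np.where(W @ x >= 0, 1, -1)
        if np.array_equal(x_new, x):
            break
        x = x_new
    return x   # denoised pattern, to be presented to the ART1 classifier
```
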
PPNN: a faster learning and better generalizing neural net
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170513
Authors: B. Xu, L. Zheng
Abstract: It is pointed out that the planar topology of the conventional backpropagation neural network (BPNN) limits solutions to its slow convergence, local minima, and other associated problems. The parallel probabilistic neural network (PPNN), based on a novel network topology called stereotopology, is proposed to overcome these problems. The learning ability and generalization ability of BPNN and PPNN are compared on several problems. Simulation results show that PPNN learned various kinds of problems much faster than BPNN and also generalized better. The faster, universal learnability of PPNN is attributed to the parallel character of its stereotopology, and the better generalization to the probabilistic character of its memory-retrieval rule.
Citations: 4

Implementation of visual reconstruction networks-alternatives to resistive networks
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170649
Authors: D. Mansor, D. Suter
Abstract: The resistive grid approach has been adopted in the Harris coupled depth-slope analog network and generalized to regularization involving arbitrary degrees of smoothness. The authors consider implementations of arbitrary-order regularization networks that do not require resistive grids. The approach is to generalize the original formulation of J.G. Harris (1987) and then to follow the alternative paths to analog circuit realization that the generalization allows.
Citations: 2

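As background for the regularization being implemented, a standard Tikhonov-style reconstruction energy with increasing orders of smoothness can be written as follows; this is a generic textbook form assumed for illustration, not the paper's specific circuit equations.

```latex
% Generic regularized surface reconstruction: fit the data d at sample points
% while penalising derivatives of order k of the reconstruction f.
\begin{equation}
  E(f) \;=\; \sum_{i} \bigl(f(x_i) - d_i\bigr)^2
        \;+\; \lambda \int \Bigl|\frac{d^{k} f}{dx^{k}}\Bigr|^{2} dx .
\end{equation}
% k = 1 gives the membrane (first-order) smoother that a resistive grid realises;
% higher k corresponds to the "arbitrary degrees of smoothness" in the abstract,
% which motivates implementations that go beyond a simple resistive grid.
```
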
Neural network training using homotopy continuation methods
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170769
Authors: J. Chow, L. Udpa, S. Udpa
Abstract: Neural networks are widely used for classification tasks and are traditionally trained with gradient methods that minimize the training error; such techniques, however, are highly susceptible to getting trapped in local minima. The authors propose an approach for obtaining the global minimum of the training error by employing the homotopy continuation method to minimize the classification error during training. Two approaches are considered: the first uses a polynomial model of the nodal activation function, and the second uses the traditional sigmoid function. Results illustrating the superiority of the homotopy method over gradient descent are presented.
Citations: 6

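The continuation idea can be illustrated with a generic sketch: deform an easy convex objective into the true training error and re-minimise as the deformation parameter grows. This is a simplified illustration under assumptions; the paper's specific homotopies (polynomial and sigmoid activations) and its global-optimality argument are not reproduced here.

```python
# Generic homotopy continuation for minimisation:
# H(w, t) = (1 - t) * g(w) + t * E(w), tracked as t goes from 0 to 1.
import numpy as np

def continuation_minimize(grad_E, w0, t_steps=20, inner_steps=200, lr=0.05):
    """Start from the easy problem g(w) = 0.5 * ||w - w0||^2 (gradient w - w0)
    and follow the minimiser of H(w, t) by gradient descent at each t."""
    w = np.asarray(w0, dtype=float).copy()
    for t in np.linspace(0.0, 1.0, t_steps + 1):
        for _ in range(inner_steps):              # re-minimise at this deformation level
            w -= lr * ((1.0 - t) * (w - w0) + t * grad_E(w))
    return w

# usage on a toy non-convex error surface E(w) = sum(sin(3w) + 0.1 w^2):
grad_E = lambda w: 3.0 * np.cos(3.0 * w) + 0.2 * w
w_star = continuation_minimize(grad_E, w0=np.array([2.0, -1.5]))
```
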
Hopfield network with O(N) complexity using a constrained backpropagation learning
[Proceedings] 1991 IEEE International Joint Conference on Neural Networks | Pub Date: 1991-11-18 | DOI: 10.1109/IJCNN.1991.170606
Authors: G. Martinelli, R. Prefetti
Abstract: A novel associative memory model derived from the discrete Hopfield neural network is presented. Its architecture is greatly simplified because the number of interconnections grows only linearly with the dimensionality of the stored patterns. A modified backpropagation algorithm is used as the learning tool. During the retrieval phase the network operates as an autoassociative BAM (bidirectional associative memory) that searches for a minimum of an appropriate energy function. Computer simulations show the good performance of the proposed learning method in terms of capacity and the number of spurious stable states.
Citations: 1