International 1989 Joint Conference on Neural Networks: Latest Publications

Hybrid distributed/local connectionist architectures
International 1989 Joint Conference on Neural Networks Pub Date: 1992-01-02 DOI: 10.1109/IJCNN.1989.118344
Tariq Samad
Abstract: Summary form only given, as follows. A class of neural network architectures is described that uses both distributed and local representations. The distributed representations are used for input and output, thereby enabling associative, noise-tolerant interaction with the environment. Internally, all representations are fully local. This simplifies weight assignment and makes the networks easy to configure for specific applications. These hybrid distributed/local architectures are especially useful for applications where structured information needs to be represented. Three such applications are briefly discussed: a scheme for knowledge representation, a connectionist rule-based system, and a knowledge-base browser.
Citations: 8
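As a hedged illustration of the hybrid scheme described in the abstract above, the sketch below maps a noisy distributed input code to a fully local (one-unit-per-concept) internal layer via winner-take-all, then re-expands it to a distributed output code. The code lengths, the winner-take-all decoding step, and all variable names are assumptions for illustration, not details from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Distributed codes: each concept is a dense +/-1 pattern over many units.
n_concepts, code_len = 5, 32
input_codes = rng.choice([-1.0, 1.0], size=(n_concepts, code_len))
output_codes = rng.choice([-1.0, 1.0], size=(n_concepts, code_len))

# Local internal layer: exactly one unit per concept, so weights can be
# assigned directly rather than learned.
W_in = input_codes            # unit i responds maximally to concept i's input code
W_out = output_codes          # unit i emits concept i's output code

def recall(noisy_input):
    """Map a (possibly noisy) distributed input to a distributed output."""
    activation = W_in @ noisy_input          # correlation with each stored code
    local = np.zeros(n_concepts)
    local[np.argmax(activation)] = 1.0       # winner-take-all -> local representation
    return W_out.T @ local                   # re-expand to a distributed output code

# Noise-tolerant associative recall: flip a few bits of concept 2's input code.
probe = input_codes[2].copy()
probe[:5] *= -1.0
print(np.array_equal(recall(probe), output_codes[2]))   # expected True for mild noise
```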
A new back-propagation algorithm with coupled neuron
International 1989 Joint Conference on Neural Networks Pub Date: 1991-09-01 DOI: 10.1109/IJCNN.1989.118442
M. Fukumi, S. Omatu
Abstract: Summary form only given, as follows. A novel algorithm is developed for training multilayer fully connected feedforward networks of coupled neurons with both sigmoid and signum functions. Such networks can be trained by the familiar backpropagation algorithm, since the proposed coupled neuron (CONE) uses the differentiable sigmoid function for its trainability. The algorithm is called CNR, or coupled neuron rule. The backpropagation (BP) and MRII algorithms, each with its own advantages and disadvantages, were developed earlier. The CONE takes advantage of the key ideas of both methods. By applying CNR to a simple network, it is shown that the convergence of the output error is much faster than that of the BP method when a variable learning rate is used. Finally, simulation results illustrate the effectiveness of the learning algorithm.
Citations: 2
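The summary does not spell out the CNR update rule itself. As a rough, hedged sketch of the setting only, the following trains a small feedforward network by ordinary backpropagation through the differentiable sigmoid path with a decaying (variable) learning rate, with a hard-limiting readout at the end; the network sizes, schedule, and XOR test case are assumptions, not the paper's method.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# XOR problem as a minimal test case (not from the paper).
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
T = np.array([[0], [1], [1], [0]], dtype=float)

rng = np.random.default_rng(1)
W1 = rng.normal(0, 1, (2, 3)); b1 = np.zeros(3)
W2 = rng.normal(0, 1, (3, 1)); b2 = np.zeros(1)

for epoch in range(5000):
    eta = 2.0 / (1.0 + epoch / 1000)          # assumed variable learning-rate schedule
    # Forward pass through the differentiable (sigmoid) path.
    h = sigmoid(X @ W1 + b1)
    y = sigmoid(h @ W2 + b2)
    # Backward pass: standard squared-error backpropagation.
    dy = (y - T) * y * (1 - y)
    dh = (dy @ W2.T) * h * (1 - h)
    W2 -= eta * h.T @ dy;  b2 -= eta * dy.sum(0)
    W1 -= eta * X.T @ dh;  b1 -= eta * dh.sum(0)

# A hard-limiting (signum-like) decision can be taken at run time.
print((sigmoid(sigmoid(X @ W1 + b1) @ W2 + b2) > 0.5).astype(int).ravel())
```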
A novel objective function for improved phoneme recognition using time delay neural networks
International 1989 Joint Conference on Neural Networks Pub Date: 1990-06-01 DOI: 10.1109/IJCNN.1989.118586
J. Hampshire, A. Waibel
Abstract: The authors present single- and multispeaker recognition results for the voiced stop consonants /b, d, g/ using time-delay neural networks (TDNN), a new objective function for training these networks, and a simple arbitration scheme for improved classification accuracy. With these enhancements a median 24% reduction in the number of misclassifications made by TDNNs trained with the traditional backpropagation objective function is achieved. This results in /b, d, g/ recognition rates that consistently exceed 98% for TDNNs trained with individual speakers; it yields a 98.1% recognition rate for a TDNN trained with three male speakers.
Citations: 235
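The abstract above does not reproduce the proposed objective function. As a hedged sketch of the overall TDNN setup only, the following models a time-delay layer as a 1-D convolution over spectral frames, followed by temporal integration into three class scores for /b/, /d/, /g/. Layer sizes, window width, and input dimensions are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Toy input: 15 frames of 16 spectral coefficients (sizes are assumptions).
frames, n_mel = 15, 16
x = rng.normal(size=(frames, n_mel))

# Time-delay layer: each hidden unit sees the current frame plus two delayed
# frames, i.e. a 1-D convolution with a window of 3 frames.
window, n_hidden, n_classes = 3, 8, 3              # classes: /b/, /d/, /g/
W1 = rng.normal(scale=0.1, size=(n_hidden, window * n_mel))
W2 = rng.normal(scale=0.1, size=(n_classes, n_hidden))

def tdnn_forward(x):
    hidden = []
    for t in range(x.shape[0] - window + 1):
        patch = x[t:t + window].ravel()            # frame t and its delayed copies
        hidden.append(sigmoid(W1 @ patch))
    hidden = np.array(hidden)                      # (frames - 2, n_hidden)
    scores = hidden @ W2.T                         # per-frame class evidence
    integrated = scores.sum(axis=0)                # integrate over time for shift tolerance
    return np.exp(integrated) / np.exp(integrated).sum()   # class probabilities

print(tdnn_forward(x))    # three probabilities, one per stop consonant
```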
Optimization of a digital neuron design
International 1989 Joint Conference on Neural Networks Pub Date: 1990-04-03 DOI: 10.1145/99633.99644
F. Kampf, P. Koch, K. Roy, M. Sullivan, Z. Delalic, S. DasGupta
Abstract: Summary form only given, as follows. Artificial neural network models, composed of many nonlinear processing elements operating in parallel, have been extensively simulated in software. The real estate required for neurons and their interconnections has been the major hindrance to hardware implementation; therefore, a reduction in neuron size is highly advantageous. A digital neuron design consisting of an arithmetic logic unit (ALU) has been implemented to conform to the hard-limiting threshold function. Studies on reducing the ALU size, utilizing Monte Carlo simulations, indicate that the effect of such a reduction on network reliability and efficiency is not detrimental. Neurons with reduced ALU size operate with the same computational abilities as full-size neurons.
Citations: 3
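As a hedged illustration of the kind of Monte Carlo study described above, the sketch below compares a hard-limiting digital neuron using full-width products with one whose products are truncated to a reduced bit width. The bit widths, input distribution, and disagreement measure are assumptions, not the paper's actual design parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def hard_limit(s):
    """Hard-limiting threshold function of a digital neuron."""
    return np.where(s >= 0, 1, -1)

def neuron_output(x, w, bits=None):
    """Weighted sum in integer arithmetic, optionally truncated to `bits`."""
    products = x * w
    if bits is not None:                        # model a reduced-width ALU
        lim = 2 ** (bits - 1)
        products = np.clip(products, -lim, lim - 1)
    return hard_limit(products.sum())

# Monte Carlo comparison of full-width vs. reduced-width neurons.
n_inputs, trials, mismatches = 32, 10_000, 0
for _ in range(trials):
    x = rng.integers(-8, 8, size=n_inputs)      # assumed 4-bit signed inputs
    w = rng.integers(-8, 8, size=n_inputs)      # assumed 4-bit signed weights
    if neuron_output(x, w) != neuron_output(x, w, bits=6):
        mismatches += 1
print(f"disagreement rate: {mismatches / trials:.4%}")
```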
Variants of self-organizing maps
International 1989 Joint Conference on Neural Networks Pub Date: 1990-03-01 DOI: 10.1109/IJCNN.1989.118292
J. Kangas, T. Kohonen, Jorma T. Laaksonen
Abstract: Self-organizing maps have a connection with traditional vector quantization. A characteristic which makes them resemble certain biological brain maps, however, is the spatial order of their responses, which is formed in the learning process. Two innovations are discussed: dynamic weighting of the input signals at each input of each cell, which improves the ordering when very different input signals are used, and definition of neighborhoods in the learning algorithm by the minimum spanning tree, which provides a far better and faster approximation of prominently structured density functions. It is cautioned that if the maps are used for pattern recognition and decision processes, it is necessary to fine-tune the reference vectors such that they directly define the decision borders.
Citations: 371
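For context, here is a minimal sketch of the baseline self-organizing map update that the two variants above modify; the paper's per-input dynamic weighting and minimum-spanning-tree neighborhoods are not implemented here, and the map size, schedules, and toy data are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# 1-D map of 10 reference vectors in a 2-D input space (sizes are assumptions).
n_units, dim = 10, 2
codebook = rng.random((n_units, dim))

def train_som(data, codebook, epochs=50, eta0=0.5, radius0=3.0):
    units = np.arange(len(codebook))
    for epoch in range(epochs):
        eta = eta0 * (1 - epoch / epochs)                  # decaying learning rate
        radius = max(radius0 * (1 - epoch / epochs), 0.5)  # shrinking neighborhood
        for x in data:
            winner = np.argmin(np.linalg.norm(codebook - x, axis=1))
            # Gaussian neighborhood on the fixed 1-D map topology; the MST variant
            # would define this neighborhood over a spanning tree of the units instead.
            h = np.exp(-((units - winner) ** 2) / (2 * radius ** 2))
            codebook += eta * h[:, None] * (x - codebook)
    return codebook

data = rng.random((200, dim))
codebook = train_som(data, codebook)
print(codebook.round(2))   # reference vectors spread in spatial order along the map
```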
Multitarget tracking with an optical neural net using a quadratic energy function
International 1989 Joint Conference on Neural Networks Pub Date: 1990-03-01 DOI: 10.1117/12.969762
M. Yee, E. Barnard, D. Casasent
Abstract: Summary form only given, as follows. Multitarget tracking over consecutive pairs of time frames is accomplished with a neural net. This involves position and velocity measurements of the targets and a quadratic neural energy function. Simulation data are presented, and an optical implementation is discussed.
Citations: 6
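The summary omits the exact energy used. As a hedged sketch of one quadratic energy over a binary association matrix, the following penalizes position/velocity mismatch between measurements in consecutive frames plus soft row/column constraints so each target is matched once; the cost term and coefficients are assumptions, not the paper's formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Measurements in two consecutive frames: position (x, y) and velocity (vx, vy).
frame1 = rng.random((4, 4))
frame2 = frame1 + 0.05 * rng.standard_normal((4, 4))   # same targets, slightly moved

# Cost of associating measurement i (frame 1) with j (frame 2): squared
# position/velocity mismatch (an assumed stand-in for the paper's cost).
C = ((frame1[:, None, :] - frame2[None, :, :]) ** 2).sum(axis=2)

def energy(V, A=1.0, B=5.0):
    """Quadratic energy of a candidate association matrix V (entries in [0, 1])."""
    match_cost = A * (V * C).sum()
    row_penalty = B * ((V.sum(axis=1) - 1) ** 2).sum()   # each frame-1 target used once
    col_penalty = B * ((V.sum(axis=0) - 1) ** 2).sum()   # each frame-2 target used once
    return match_cost + row_penalty + col_penalty

identity_match = np.eye(4)
shuffled_match = np.eye(4)[[1, 0, 2, 3]]
print(energy(identity_match), energy(shuffled_match))   # correct pairing should score lower
```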
An electrically trainable artificial neural network (ETANN) with 10240 'floating gate' synapses
International 1989 Joint Conference on Neural Networks Pub Date: 1990-01-03 DOI: 10.1109/IJCNN.1989.118698
M. Holler, S. Tam, H. Castro, Robert G. Benson
Abstract: The use of floating-gate nonvolatile memory technology for analog storage of connection strengths, or weights, has previously been proposed and demonstrated. The authors report the analog storage and multiply characteristics of a new floating-gate synapse and further discuss the architecture of a neural network which uses this synapse cell. In the architecture described, 8192 synapses are used to interconnect 64 neurons fully and to connect the 64 neurons to each of 64 inputs. Each synapse in the network multiplies a signed analog voltage by a stored weight and generates a differential current proportional to the product. Differential currents are summed on a pair of bit lines and transferred through a sigmoid function, appearing at the neuron output as an analog voltage. Input and output levels are compatible for ease in cascade-connecting these devices into multilayer networks. The width and height of weight-change pulses are calculated. The synapse cell size is 2009 μm² using 1-μm CMOS EEPROM technology.
Citations: 439
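As a hedged numerical model of the signal path described above (signed input times stored analog weight gives a differential current, currents sum per neuron, and a sigmoid gives the output voltage), the sketch below uses the stated 64-input, 64-neuron, 8192-synapse connectivity; the weight ranges, gain, and settling loop are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs, n_neurons = 64, 64

# Stored analog weights: 64x64 input synapses plus 64x64 feedback synapses
# fully interconnecting the neurons (8192 synapses, as in the abstract).
W_in = rng.uniform(-1, 1, (n_neurons, n_inputs))
W_fb = rng.uniform(-1, 1, (n_neurons, n_neurons))

def sigmoid(v):
    return 1.0 / (1.0 + np.exp(-v))

def etann_step(x, y_prev, gain=1.0):
    """One settling step: each synapse contributes a differential current
    proportional to weight * input; currents sum on the neuron's bit lines
    and pass through a sigmoid to give an analog output voltage in (0, 1)."""
    current = W_in @ x + W_fb @ y_prev
    return sigmoid(gain * current)

x = rng.uniform(-1, 1, n_inputs)          # signed analog input voltages
y = np.zeros(n_neurons)
for _ in range(5):                        # let the feedback path settle
    y = etann_step(x, y)
print(y[:4].round(3))                     # first few neuron output voltages
```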
Neuroplanners for hand/eye coordination
International 1989 Joint Conference on Neural Networks Pub Date: 1989-12-01 DOI: 10.1109/IJCNN.1989.118296
D. H. Graf, W. LaLonde
Abstract: The authors generalize a previously described architecture, which they now call a neuroplanner, and apply it to an extension of the problem it was initially designed to solve: the target-directed control of a robot arm in an obstacle-cluttered workspace. By target-directed they mean that the arm can position its end-effector at the point of gaze specified by a pair of stereo targeting cameras. Hence, the system is able to 'touch' the point targeted by its eyes. The new design extends the targeting system to an articulated camera platform, the equivalent of the human eye-head-neck system. This permits the robot to solve the inverse problem: given the current configuration of the arm, the system is able to reorient the camera platform to focus on the end-effector. Because of obstacles, the camera platform will generally have to peer around obstacles that block its view. Hence the new system is able to move the eye-head-neck system to see the hand.
Citations: 21
A backpropagation network for classifying auditory brainstem evoked potentials: input level biasing, temporal and spectral inputs and learning patterns
International 1989 Joint Conference on Neural Networks Pub Date: 1989-12-01 DOI: 10.1109/IJCNN.1989.118422
Dogan Alpsan, Can Ozdamar
Abstract: Summary form only given, as follows. The results of an investigation conducted to examine the effects of various input data forms on learning of a neural network for classifying auditory evoked potentials are presented. The long-term objective is to use the classification in an automated device for hearing threshold testing. Feedforward multilayered neural networks trained with the backpropagation method are used. The effects of presenting the data to the neural network in various temporal and spectral modes are explored. Results indicate that temporal and spectral information complement one another and increase performance when used together. Learning curves and dot graphs, as used in this study, may reveal network learning strategies. The nature of such learning patterns found in this study is discussed.
Citations: 6
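As a hedged sketch of how temporal and spectral views of the same evoked-potential sweep can be presented to a feedforward classifier together, the following builds a combined input vector from the raw samples and their FFT magnitudes; the sweep length, normalization, and toy waveform are assumptions, not the study's recording parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_input_vector(sweep, n_spectral=32):
    """Concatenate a normalized temporal window with low-frequency FFT magnitudes."""
    temporal = (sweep - sweep.mean()) / (sweep.std() + 1e-9)
    spectrum = np.abs(np.fft.rfft(sweep))[:n_spectral]
    spectral = spectrum / (spectrum.max() + 1e-9)
    return np.concatenate([temporal, spectral])

# Toy evoked-potential sweep: 256 samples of a noisy damped oscillation
# (stand-in data only; real ABRs come from averaged EEG recordings).
t = np.linspace(0, 10e-3, 256)
sweep = np.exp(-t / 4e-3) * np.sin(2 * np.pi * 500 * t) + 0.1 * rng.standard_normal(256)

x = make_input_vector(sweep)
print(x.shape)   # (288,) = 256 temporal + 32 spectral inputs to the network
```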
A neuro-expert architecture for object recognition
International 1989 Joint Conference on Neural Networks Pub Date: 1989-12-01 DOI: 10.1109/IJCNN.1989.118315
J. Selinsky, A. Guez, J. Eilbert, M. Kam
Abstract: Summary form only given, as follows. A report is presented on the results of experiments in object recognition with a combined neural network/expert system architecture (neuro-expert). The neuro-expert architecture is outlined with a description of the experimental object recognition system. Results are reported for the recognition of a 20-pattern prototype set of synthesized binary images placed at arbitrary rotations. A 100% recognition rate was obtained under noiseless conditions. Addition of 1% and 2% random pixel noise resulted in recognition rates of 95.2% and 89.5%, respectively.
Citations: 0
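As a hedged sketch of the noise condition reported above (1% and 2% random pixel noise on binary images), the following flips a given fraction of pixels in a binary prototype; the image size and flip mechanics are assumptions, not the paper's exact procedure.

```python
import numpy as np

rng = np.random.default_rng(0)

def add_pixel_noise(image, fraction):
    """Flip a randomly chosen fraction of pixels in a binary image."""
    noisy = image.copy()
    n_flip = int(round(fraction * image.size))
    idx = rng.choice(image.size, size=n_flip, replace=False)
    flat = noisy.ravel()
    flat[idx] ^= 1                      # toggle 0 <-> 1
    return noisy

prototype = rng.integers(0, 2, size=(32, 32))        # stand-in binary pattern
for fraction in (0.01, 0.02):                        # the noise levels in the abstract
    noisy = add_pixel_noise(prototype, fraction)
    changed = (noisy != prototype).sum()
    print(f"{fraction:.0%} noise -> {changed} pixels flipped out of {prototype.size}")
```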