Latest papers: International 1989 Joint Conference on Neural Networks

A neuromorphic learning strategy for the control of a one-legged hopping machine
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118499
J. Helferty, J. Collins, M. Kam
Abstract (summary form only): An adaptive neural-network strategy is described for the control of a dynamic locomotive system, in particular a one-legged hopping robot. The control task is to make corrections to the motion of the robot that maintain a fixed level of energy and minimize energy losses. While energy conservation may not be a key control criterion for many dynamic systems, legged locomotion is an energy-intensive activity, making energy conservation a primary control consideration. The authors control the robot with an artificial neural network (ANN) that has a continuous learning memory. Computer simulations demonstrate the ANN's ability to devise control signals that develop a stable hopping strategy from imprecise knowledge of the current state of the robotic leg.
Citations: 4
Integration of neural heuristics into knowledge-based inference
International 1989 Joint Conference on Neural Networks. DOI: 10.1080/09540098908915644
L. Fu
Abstract (summary form only): Analogies observed between belief networks and neural networks make it plausible to introduce heuristics developed under the neural-network approach into knowledge-based systems. An approach has been developed that maps a rule-based system into the neural architecture in both its structural and its behavioral aspects. Under this approach, the knowledge base and the inference engine are mapped into an entity called a conceptualization, in which a node represents a concept and a link represents a relation between two concepts. Inference in the conceptualization involves the propagation and combination of activations as well as maximizing information transmission through layers. Learning is based on a backpropagation mechanism that modifies the connection strengths so the system adapts to its environment. The validity of the approach has been demonstrated by experiments.
Citations: 104
Optimal task assignment using a neural network
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118372
T. Tanaka, J. R. Canfield, S. Oyanagi, H. Genchi
Abstract (summary form only): A neural network is described that solves the problem of optimally assigning tasks to processors in a message-passing parallel machine. The task assignment problem (TAP) is defined by a cost function that expresses the cost of communication overhead and load imbalance. TAP is a combinatorial optimization problem that can be solved efficiently with a neural network, but the Hopfield and Tank approach has certain limitations. The authors address these limitations with an improved Hopfield model network. By representing TAP more directly in the neural network, the need for constraint terms is eliminated, a valid solution is guaranteed, and the number of neurons and connections needed is reduced substantially.
Citations: 3
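The abstract names the two ingredients of the cost function but not its exact form; the following is an illustrative reconstruction, not the authors' formula: communication traffic between task pairs split across processors, plus a variance penalty on per-processor load.

```python
# Illustrative task-assignment cost: communication overhead + load imbalance.
# The exact formula is an assumption; the paper's cost function may differ.
import numpy as np

def tap_cost(assign, comm, load, n_proc, balance_weight=1.0):
    """assign[i]: processor of task i; comm[i][j]: traffic between tasks i, j."""
    n = len(assign)
    # Traffic is paid only for task pairs placed on different processors.
    comm_cost = sum(comm[i][j]
                    for i in range(n) for j in range(i + 1, n)
                    if assign[i] != assign[j])
    proc_load = np.zeros(n_proc)
    for task, proc in enumerate(assign):
        proc_load[proc] += load[task]
    # Variance of per-processor load penalizes imbalance.
    return comm_cost + balance_weight * proc_load.var()

comm = np.array([[0, 4, 1],
                 [4, 0, 0],
                 [1, 0, 0]])
load = [2.0, 2.0, 4.0]
# Placing the two chatty tasks (0 and 1) together beats splitting them:
print(tap_cost([0, 0, 1], comm, load, 2))   # -> 1.0 (comm 1, loads balanced)
print(tap_cost([0, 1, 1], comm, load, 2))   # -> 9.0 (comm 5, imbalance 4)
```

A network that minimizes such a cost directly over task-to-processor choices needs no extra constraint terms, which is the simplification the authors claim over the Hopfield-Tank formulation.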
A parametric model for synthesis of cortical column patterns
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118416
A. Rojer, E. Schwartz
Abstract (summary form only): The authors introduce a parametric model of columnar structure that treats its spatial form in an image-processing framework. The method permits easy synthesis of column-like structure from noise images: bandpass filtering of a noise image followed by thresholding yields patterns that strongly resemble the columnar structure observed in the brain. The image-oriented technique is flexible and inexpensive to compute. There are only a few independent parameters, and the role each plays in column formation is apparent; the parameters for a particular column system can be determined from actual brain data with standard image-processing techniques. The authors have used the model to process data from their computer reconstruction of the pattern of ocular dominance columns in the macaque monkey. The approach avoids constructing computationally expensive cellular models based on poorly understood details of neural development, and provides an efficient, accurate model that can be adjusted to fit a wide variety of column data.
Citations: 7
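The pipeline described (bandpass-filter a noise image, then threshold) can be sketched in a few lines. The passband radii and the zero threshold below are illustrative assumptions, not the authors' fitted parameters.

```python
# Sketch of column-pattern synthesis: bandpass-filtered white noise,
# thresholded to a binary image, yields stripe/blob patterns resembling
# cortical (e.g. ocular dominance) columns. Parameters are illustrative.
import numpy as np

def synthesize_columns(size=128, low=2.0, high=6.0, seed=0):
    rng = np.random.default_rng(seed)
    noise = rng.standard_normal((size, size))
    # Annular (bandpass) mask in the frequency domain.
    fy = np.fft.fftfreq(size)[:, None] * size
    fx = np.fft.fftfreq(size)[None, :] * size
    radius = np.hypot(fy, fx)
    mask = (radius >= low) & (radius <= high)
    filtered = np.fft.ifft2(np.fft.fft2(noise) * mask).real
    # Threshold at zero: binary column-like pattern.
    return filtered > 0

pattern = synthesize_columns()
print(pattern.shape, pattern.mean())  # roughly half the pixels are "on"
```

The passband center sets the column spacing and its width sets the pattern's regularity, which is the sense in which "only a few independent parameters" control column formation.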
Is backpropagation biologically plausible?
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118705
D. Stork, Jordan Hall
Abstract: The authors search for neurobiologically plausible implementations of the backpropagation gradient-descent algorithm. Any such implementation must be consistent with physical constraints such as locality (i.e., the behavior of any component can be influenced only by components in physical contact with it) and with contingent facts of biology, and must also preserve global network properties such as fault tolerance, stability, and graceful degradation under hardware errors. The authors find that in several posited implementations these design considerations imply a finely structured neural connectivity, as well as more neurons and synapses than are suggested by the algorithmic network presentations of backpropagation. Gating synapses (sigma-pi units) are present, while Hebbian (or pseudo-Hebbian) synapses are absent, in all the posited implementations. Although backpropagation can in principle be implemented in neurobiology, such a highly structured network, and the organizational principles required to generate it at the level of individual neurons, will require more support from experimental neurobiology.
Citations: 86
Pattern classification using trainable logic networks
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118599
B. W. Evans
Abstract: The author describes a new pattern-classification algorithm that has the simplicity of the well-known multilinear classifier but is capable of learning patterns through supervised training. This is achieved by replacing the discretely valued logic functions of the conventional classifier with continuous extensions. The resulting differentiable relationship between network parameters and outputs permits the use of gradient-descent methods to select optimal classifier parameters. The classifier can be implemented as a network whose structure is well suited to highly parallel hardware; essentially the same network can compute weight adjustments and perform classifications, so the same hardware could serve both rapid training and classification. The author has applied the classifier to a noisy parity-detection problem, where the classification error frequency compares favourably with the theoretical lower bound.
Citations: 0
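The key move is replacing discrete logic functions with continuous extensions so that gradients exist. A standard choice of such extensions (an assumption here; the paper may use others) reduces to ordinary Boolean logic at the corners of the unit hypercube while being differentiable in between:

```python
# Continuous extensions of Boolean logic (one standard choice; the paper's
# exact extensions are not given in the abstract).
def soft_not(a):    return 1.0 - a
def soft_and(a, b): return a * b
def soft_or(a, b):  return a + b - a * b   # = 1 - (1-a)(1-b)

# At Boolean corners these reduce to ordinary logic:
for a in (0.0, 1.0):
    for b in (0.0, 1.0):
        assert soft_and(a, b) == float(bool(a) and bool(b))
        assert soft_or(a, b) == float(bool(a) or bool(b))

# Between corners they are differentiable, so gradient descent can tune
# parameters feeding into them; e.g. the partial derivative of soft_or:
def d_soft_or_da(a, b): return 1.0 - b
```

Because every gate is differentiable, the chain rule propagates error gradients through an entire logic network, which is what makes the classifier trainable by gradient descent.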
Brain-state-in-a-box neural networks with asymmetric coefficients
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118642
L. Vandenberghe, J. Vandewalle
Abstract: The equilibrium condition for brain-state-in-a-box neural networks is formulated as a variational inequality, well known in operations research and mathematical programming as a unified description of many equilibrium problems. In the case of symmetric coefficients, this variational inequality coincides with the first-order necessary conditions for minimality of the energy function of the neural net, but it remains valid when the coefficients are not symmetric. In that case it leads to an appealing interpretation of equilibrium as a solution of a multiple-objective optimization problem. The study also provides conditions for uniqueness and global stability of the equilibrium state without any symmetry assumption.
Citations: 6
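For readers unfamiliar with the model: a brain-state-in-a-box network iterates a linear map and clips the state to the hypercube [-1, 1]^n, and equilibria sit on the box's boundary. A minimal sketch with a deliberately asymmetric coefficient matrix (values are illustrative, not from the paper):

```python
# Brain-state-in-a-box iteration: x <- clip(x + a*W*x, -1, 1).
# W is deliberately asymmetric (W != W.T), the case the paper analyzes.
import numpy as np

def bsb_step(x, W, a=0.2):
    return np.clip(x + a * (W @ x), -1.0, 1.0)

def run_bsb(x0, W, steps=100):
    x = np.array(x0, dtype=float)
    for _ in range(steps):
        x = bsb_step(x, W)
    return x

W = np.array([[ 0.5, 0.2],
              [-0.1, 0.4]])   # asymmetric coefficients
x_final = run_bsb([0.3, -0.2], W)
print(x_final)  # the state saturates at a corner of the box
```

With symmetric W, such corners are minima of an energy function; the paper's point is that the equilibrium condition can still be characterized (via a variational inequality) when no such energy function exists.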
Sensor calibration using artificial neural networks
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118324
O. Masory, A. L. Aguirre
Abstract (summary form only): The calibration, using an artificial neural network (ANN), of a 2-D displacement sensor that suffers from nonlinearities and crosstalk is described. The ANN is used as a pattern associator trained to perform the mapping between the sensor's readings and the actual sensed properties. For comparison, several methods were explored: a three-layer ANN with varying numbers of hidden units, trained by backpropagation; a cerebellar model arithmetic computer (CMAC) with a fixed number of quantizing functions; and a polynomial curve-fitting technique. The results of the calibration procedure and recommendations are discussed.
Citations: 1
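Of the compared methods, the polynomial curve fit is the simplest to sketch: learn a mapping from distorted readings back to the true quantity. The sensor nonlinearity below is synthetic and assumed purely for illustration, in 1-D rather than the paper's 2-D case.

```python
# Calibration by polynomial curve fitting: fit reading -> true value.
# The sensor distortion here is an assumed synthetic nonlinearity.
import numpy as np

true = np.linspace(0.0, 1.0, 50)
reading = true + 0.15 * true**2 - 0.05 * true**3   # assumed nonlinearity

# Fit a cubic mapping distorted readings back to the true quantity.
coeffs = np.polyfit(reading, true, deg=3)
corrected = np.polyval(coeffs, reading)
print(np.max(np.abs(corrected - true)))  # small residual after calibration
```

The ANN and CMAC alternatives compared in the paper play the same role as `polyfit` here but can also absorb the cross-axis coupling (crosstalk) that a per-axis polynomial cannot.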
A neural network for 3-satisfiability problems
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118356
W.-T. Chen, Keun-Rong Hsieh
Abstract (summary form only): Based on Hopfield's associative-memory model, a scheme for solving 3-satisfiability (3-SAT) problems is proposed. For problems such as 3-SAT, the partial constraints are easy to determine, but the global constraint is hard to find. The neural-network associative memory is viewed as a kind of active memory: it does not merely store data items but also manipulates them, and the operations it performs can be regarded as constraint satisfaction. It is thus possible to store partial assignments that satisfy the local constraints of the problem and let the memory compose complete assignments that satisfy the global constraints. Simulation results show that the scheme can solve most instances of 3-SAT.
Citations: 2
Combining self-organizing maps
International 1989 Joint Conference on Neural Networks. DOI: 10.1109/IJCNN.1989.118289
H. Ritter
Abstract: The author proposes a learning rule for a single-layer network of modules representing adaptive tables of the type formed by T. Kohonen's vector quantization algorithm (Rep. TKK-F-A601, Helsinki Univ. of Technol., 1986). The learning rule allows several modules to be combined to learn more complicated functions on higher-dimensional spaces. During learning, each module learns a function that is adjusted to minimize the average squared error between the correct function and the function represented by the network. Although this is a single-layer system, the capability of each module to learn an arbitrary nonlinearity gives the system far more flexibility than a perceptron. At the same time, for output nonlinearities that are a product or a sum of monotonic functions of their arguments, there is a unique minimum to which the system is guaranteed to converge.
Citations: 28
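A single "adaptive table" module of the kind described can be sketched as a codebook of cells with one adjustable output each, updated to reduce squared error. This is an illustrative stand-in, not Ritter's exact rule, and it shows one module only rather than the paper's combination of several.

```python
# One adaptive-table module (illustrative): quantize the input with a
# small 1-D codebook and LMS-update the winning cell's stored output.
import numpy as np

rng = np.random.default_rng(1)
centers = np.linspace(0, 1, 20)    # fixed quantization cells
table = np.zeros_like(centers)     # adjustable output per cell

def predict(x):
    return table[np.argmin(np.abs(centers - x))]

target = lambda x: x * (1 - x)     # an arbitrary nonlinearity to learn

for _ in range(5000):
    x = rng.random()
    i = np.argmin(np.abs(centers - x))
    table[i] += 0.1 * (target(x) - table[i])   # squared-error descent

errs = [abs(predict(x) - target(x)) for x in np.linspace(0, 1, 11)]
print(max(errs))   # small: each cell converges near its local mean
```

Because each cell can store any value, a single module already represents an arbitrary (piecewise-constant) nonlinearity, which is the flexibility advantage over a perceptron that the abstract notes.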