{"title":"ASIC-based development of cellular neural networks","authors":"L. Spaanenburg, P.E. deHaan, S. Neusser, J. Nijhuis, A. Siggelkow","doi":"10.1109/CNNA.1990.207523","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207523","url":null,"abstract":"The design and realization of cellular neural networks is discussed. An extension to the development environment NNSIM (neural network simulator) supports the creation of cellular networks. Furthermore the IC³ ASIC cell library is introduced, allowing fast turn-around mixed hardware/software prototyping to detail real-time effects. A typical ASIC contains 150 neurons and 750 synapses.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116199165","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cellular neural networks as a model of associative memories","authors":"S. Tan, J. Hao, J. Vandewalle","doi":"10.1109/CNNA.1990.207504","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207504","url":null,"abstract":"Concerns the design of cellular neural networks intended to function as associative memories. The authors consider a discrete-time version of cellular neural nets featuring simple linear thresholding neurons and the synchronous state-updating rule. The Hebbian rule is adopted as the memory design rule. Important issues, such as the memory capacity and the size of the attracting basin, are discussed. The validity of the method is illustrated by a simple example.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133106329","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Optical associative memory with invariances","authors":"G. G. Yang","doi":"10.1109/CNNA.1990.207534","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207534","url":null,"abstract":"A single-slab second order neural network model with scale and translation invariances is proposed. This is based on the backpropagation learning rule by using group theory to impose the invariances on the network. The results show that full range translation invariance and a limited range of scale invariance are realisable. The performance of the outer-product model with invariance is analysed. Then an inner-product associative memory model with translation invariance is proposed. A strictly increasing nonlinear operation combined with a sign function is chosen to guarantee the correct recall convergence of the network without iteration. The performance of the network is improved drastically. These two models can be implemented by simple optical systems with parallel processing capability.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"64 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123620309","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Hardware and algorithms for the functional evaluation of cellular neural networks and analog arrays","authors":"K. R. Krieg, L. Chua","doi":"10.1109/CNNA.1990.207521","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207521","url":null,"abstract":"Analog arrays are a generalization of cellular neural networks (CNN) which consist of a regular array of nonlinear analog processors at each node and nearest neighbor interactions. Analog arrays can incorporate nonlinearities in both the input and output functions and, contrasted with CNN arrays, have continuous-valued outputs in the equilibrium state. The general analog array, though more powerful than the CNN, presents a functional test nightmare. Since the output is continuous-valued (even at equilibrium) and the dynamics can be complicated, evaluating whether a fabricated VLSI array complies with the intended processing function for a wide range of inputs can be very difficult and time consuming. The number of analog inputs and outputs also strains most modern analog VLSI automatic test equipment (ATE). The authors present both a new hardware design for massive analog circuit testing and algorithms for functional test of analog arrays.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"15 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124785830","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A possible transformation of the fully connected neural nets into partially connected networks","authors":"J. Levendovszky","doi":"10.1109/CNNA.1990.207507","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207507","url":null,"abstract":"Realizing a neural network (NN) with a large number of interconnections meets severe difficulties in the case of VLSI implementation. Therefore, solving tasks by NN containing many nodes involves an acute realization problem, and the minimization of the number of interconnections is a fundamental problem of NN research. The cellular approach, which solves problems by using partially connected networks in which each neuron 'communicates' with a certain number of neighbouring ones, or at least a noncellular method to reduce the number of interconnections regardless of the neighbouring configuration, is considered. Both concepts of minimization are depicted. There is no general method to transform the original problem into an equivalent one which can be solved by a cellular or partially connected network under invariancy criteria guaranteeing the same solution as achieved by the original net. This paper provides a method and an exact procedure for accomplishing this optimization in the sense of minimizing the number of interconnections. However, the number of computations needed grows extremely fast with respect to the number of nodes, which prevents practical application to problems with large complexity.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"70 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115372372","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"COGNI-neocognitron simulation software","authors":"A. Klofutar","doi":"10.1109/CNNA.1990.207508","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207508","url":null,"abstract":"The simulation software COGNI simulates the pattern recognition neural network neocognitron of K. Fukushima (1982). Due to its complexity, simulations can be carried out only on relatively powerful computer systems which are capable of high speed numeric processing and graphic display. There are two versions available, using the IBM PC-AT and the μVAX II. Neocognitron is able to learn without a teacher. The response of the last layer in forward (afferent) paths is not affected by the pattern's position or by a small change in the shape or size of the stimulus pattern. Even stimuli corrupted with noise are successfully recognized. Autoassociation is also achieved in the last layer of backward (efferent) paths, i.e. in the autoassociation plane.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"50 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"116700358","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A cortex-like architecture of a cellular neural network","authors":"J. Henseler, P.J. Braspenning","doi":"10.1109/CNNA.1990.207529","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207529","url":null,"abstract":"The design of an artificial neural architecture based on the organisation of the cerebral cortex is described. Artificial neurons are placed on a three-dimensional grid and have three types of connections: lateral excitatory, lateral inhibitory and long-range associating connections. The lateral connections realize a relaxation process similar to a three-dimensional cellular neural network process. The long-range connections constitute a form of associative memory correlating neuron-activity patterns in different areas of the network. A three-stage neuron processing model is also introduced. Based on this model a family of processing functions for neurons in the artificial network is described. The family contains existing functions (e.g. characterized by linear or integrator equations) as well as novel functions (e.g. characterized by oscillator equations).","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"344 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"133991202","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A neural network architecture for detecting moving objects. II","authors":"V. Cimagalli","doi":"10.1109/CNNA.1990.207515","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207515","url":null,"abstract":"For pt.I see Proc. of the 3rd Italian Workshop of Parallel Architectures and Neural Networks. Summary form only given. In pt.I the author proposed an architecture for solving a problem of processing time-varying inputs. In that architecture, the signal is processed in a spatio-temporal dimension. Time is not the independent variable in the solution of a set of differential equations as in the classical case, but it plays an essential role in the interaction between the time-varying input and its processing. The purpose of the net is not, as usual, to classify and/or recognize patterns, nor to solve a problem of minimum energy, but to detect some characteristics of a signal varying with respect both to time and space. Such a network has been proved useful in solving the problem of detecting moving objects in a cluster. In this part, the architecture of the net is outlined and its performance is discussed together with its similarities and differences with respect to cellular neural networks. Results of computer simulations are given and the problem of hardware implementation is considered.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"52 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126227006","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Comparison of different numerical integration methods for simulating cellular neural networks","authors":"H. Harrer, A. Schuler, E. Amelunxen","doi":"10.1109/CNNA.1990.207519","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207519","url":null,"abstract":"Three popular single step algorithms for solving the system of nonlinear differential equations of cellular neural networks are compared. The typical behaviour of these algorithms is described, example simulations are given and their relative advantages and disadvantages are discussed.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"41 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"125866151","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Neural networks in minimax programming","authors":"S. Osowski","doi":"10.1109/CNNA.1990.207531","DOIUrl":"https://doi.org/10.1109/CNNA.1990.207531","url":null,"abstract":"The application of the neural computing network concept to minimax optimization is presented. According to the method, the minimax programming problem is first transformed to a standard single-objective optimization problem and then solved by transforming it to a set of ordinary differential equations. The clustered connection-type interpretation of the neural-based minimax approach is given. Numerical results for some chosen examples are also presented.","PeriodicalId":142909,"journal":{"name":"IEEE International Workshop on Cellular Neural Networks and their Applications","volume":"47 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"1990-12-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127633671","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}