Title: Interconnection Tensor Rank and the Neural Network Storage Capacity
Author: B. V. Kryzhanovsky
Journal: Optical Memory and Neural Networks, Vol. 34, No. 2, pp. 181–187
DOI: 10.3103/S1060992X25600272
Published: July 2, 2025 (Journal Article)
URL: https://link.springer.com/article/10.3103/S1060992X25600272
Citations: 0
Abstract
Neural network properties are considered for interconnection tensors of rank higher than two (i.e., when, in addition to the synaptic connection matrix, there are presynaptic synapses, pre-presynaptic synapses, etc.). Interconnection tensors of this sort arise in crossbar-based realizations of neural networks. Crossbar designs intrinsically suffer from parasitic currents: when a signal travels along a connection to a given neuron, part of it always leaks into other neurons’ connections through the memory cells (synapses). As a result, the signal at a neuron’s input carries noise: weak signals destined for all the other neurons. Consequently, the conductivity of an analog crossbar cell varies in proportion to this noise signal, and the cell’s output signal becomes nonlinear. It is shown that an interconnection tensor of a certain form makes the neural network considerably more efficient: both the storage capacity and the basins of attraction of the network increase substantially. A Hopfield-type network is used in the study.
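To make the idea of a rank-3 interconnection tensor concrete, the sketch below builds a generic higher-order Hopfield-type network in NumPy: alongside the usual Hebbian matrix J, a Hebbian rank-3 tensor T contributes a quadratic term to each neuron's local field. This is a minimal textbook-style illustration of the mechanism, not the specific tensor form or crossbar model analyzed in the paper; all names (`J`, `T`, `update`, the pattern count and sizes) are illustrative choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 30, 3                              # neurons, stored patterns
xi = rng.choice([-1, 1], size=(P, N))     # bipolar memory patterns

# Hebbian rank-2 matrix (classical Hopfield term)
J = np.einsum('mi,mj->ij', xi, xi) / N
np.fill_diagonal(J, 0.0)

# Hebbian rank-3 tensor: the "presynaptic synapse" analogue,
# coupling each neuron to pairs of other neurons
T = np.einsum('mi,mj,mk->ijk', xi, xi, xi) / N**2

def update(s, steps=10):
    """Synchronous dynamics with linear (J) and quadratic (T) field terms."""
    for _ in range(steps):
        h = J @ s + np.einsum('ijk,j,k->i', T, s, s)
        s = np.where(h >= 0, 1, -1)       # avoid sign(0) = 0
    return s

# Recall a stored pattern from a corrupted probe (5 flipped bits)
probe = xi[0].copy()
probe[:5] *= -1
recalled = update(probe)
```

With these small sizes the cubic term reinforces the linear one (its contribution to neuron i scales as the squared overlap with the stored pattern), so the corrupted probe is pulled back into the basin of attraction of `xi[0]`. The paper's claim is that a suitably chosen tensor of this kind enlarges those basins and raises the storage capacity relative to the rank-2 case.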
About the journal:
The journal covers a wide range of issues in information optics, such as optical memory, mechanisms for optical data recording and processing, photosensitive materials, optical, optoelectronic and holographic nanostructures, and many other related topics. Papers on memory systems using holographic and biological structures and on concepts of brain operation are also included. The journal pays particular attention to research on neural network systems that may lead to a new generation of computational technologies by endowing them with intelligence.