{"title":"Neural networks: binary monotonic and multiple-valued","authors":"J. Zurada","doi":"10.1109/ISMVL.2000.848602","DOIUrl":null,"url":null,"abstract":"This paper demonstrates how conventional neural networks can be modified, extended or generalized by introducing basic notions of multiple-valued logic to the definition of neurons. It has been shown that multilevel neurons produce useful attractor-type neural networks and lead to multistable memory cells. This opens up a possibility of storing a multiplicity of logic levels in a \"generalized\" Hopfield memory. Another interesting attractor-type network encodes information in complex output values of the neurons, and specifically, in their phase angles. This network working as a memory is able to recognize many stored grey-level values as output of a single neuron. As such, this network represents an extension of bivalent information processors. Multilevel neurons can also be employed in perceptron type classifiers trained with the error backpropagation algorithm. This offers the advantage that the resulting networks are smaller, with fewer weights and neurons to perform typical classification tasks. This improvement is achieved at a cost of considerable enhancement to the neurons' activation functions.","PeriodicalId":334235,"journal":{"name":"Proceedings 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)","volume":"28 4 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2000-05-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"6","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Proceedings 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ISMVL.2000.848602","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 6
Abstract
This paper demonstrates how conventional neural networks can be modified, extended, or generalized by introducing basic notions of multiple-valued logic into the definition of neurons. It is shown that multilevel neurons produce useful attractor-type neural networks and lead to multistable memory cells, which opens up the possibility of storing a multiplicity of logic levels in a "generalized" Hopfield memory. Another interesting attractor-type network encodes information in the complex-valued outputs of its neurons, specifically in their phase angles. Operating as a memory, this network can recall many stored grey-level values at the output of a single neuron, and as such it represents an extension of bivalent information processors. Multilevel neurons can also be employed in perceptron-type classifiers trained with the error backpropagation algorithm. The resulting networks are smaller, requiring fewer weights and neurons to perform typical classification tasks; this improvement comes at the cost of considerably more elaborate neuron activation functions.
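To make the two neuron types mentioned in the abstract concrete, the sketch below shows (a) a soft multilevel "staircase" activation built from shifted sigmoids and (b) a complex-valued neuron that quantizes the phase angle of its weighted sum onto the unit circle. This is a minimal illustration in Python/NumPy; the function names, the gain parameter, and the level counts are assumptions made for this sketch, not the paper's own definitions, which may differ in detail.

```python
import numpy as np

def multilevel_activation(u, k=4, gain=5.0):
    """Soft k-level staircase activation: a sum of (k - 1) shifted
    sigmoids, so the output moves through levels 0, 1, ..., k - 1 as
    the net input u grows. One common way to define a multilevel
    neuron; illustrative only, not the paper's exact formula."""
    thresholds = np.arange(k - 1) + 0.5  # one unit step per threshold
    return sum(1.0 / (1.0 + np.exp(-gain * (u - t))) for t in thresholds)

def phase_neuron(z, k=8):
    """Complex-valued neuron with a phase-quantizing activation: the
    weighted sum z is mapped to a unit-magnitude output whose phase is
    snapped to one of k equally spaced angles. This mirrors the idea
    of storing grey levels in phase angles; again, the paper's exact
    update rule is not reproduced here."""
    theta = np.angle(z) % (2 * np.pi)           # phase in [0, 2*pi)
    sector = np.floor(k * theta / (2 * np.pi))  # which of the k sectors
    return np.exp(1j * 2 * np.pi * sector / k)  # unit-circle output

if __name__ == "__main__":
    print(multilevel_activation(2.0))   # approx. 2.0 (middle of level 2)
    print(phase_neuron(0.5 + 0.8j))     # phase snapped to 2*pi/8 = pi/4
```

In both cases a single neuron carries more than one bit of information, which is why, as the abstract notes, classifiers built from such neurons can use fewer weights and neurons, at the price of a more elaborate activation function.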