Neural networks: binary monotonic and multiple-valued

J. Zurada
DOI: 10.1109/ISMVL.2000.848602
Published in: Proceedings 30th IEEE International Symposium on Multiple-Valued Logic (ISMVL 2000)
Publication date: 2000-05-23
Citations: 6

Abstract

This paper demonstrates how conventional neural networks can be modified, extended, or generalized by introducing basic notions of multiple-valued logic into the definition of neurons. It has been shown that multilevel neurons produce useful attractor-type neural networks and lead to multistable memory cells. This opens up the possibility of storing a multiplicity of logic levels in a "generalized" Hopfield memory. Another interesting attractor-type network encodes information in the complex output values of the neurons, specifically in their phase angles. Working as a memory, this network can recall many stored grey-level values as the output of a single neuron. As such, it represents an extension of bivalent information processors. Multilevel neurons can also be employed in perceptron-type classifiers trained with the error backpropagation algorithm. This offers the advantage that the resulting networks are smaller, with fewer weights and neurons, for typical classification tasks. The improvement is achieved at the cost of a considerable enhancement to the neurons' activation functions.
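The phase-angle encoding mentioned above can be sketched as follows. This is an illustrative reconstruction rather than the paper's exact formulation: the function name `mvn_activation` and the parameter `k` are assumptions. The idea, common to multiple-valued complex neurons, is that the complex plane is split into `k` equal angular sectors and the weighted sum is snapped to the k-th root of unity for its sector, so each of the `k` phase values can encode one logic level (e.g. one grey level).

```python
import cmath
import math

def mvn_activation(z: complex, k: int) -> complex:
    """Illustrative multilevel activation for a complex-valued neuron.

    The complex plane is divided into k equal angular sectors; the
    weighted sum z is mapped to the k-th root of unity whose sector
    contains arg(z). Each root of unity represents one logic level.
    """
    angle = cmath.phase(z) % (2 * math.pi)       # arg(z) normalized to [0, 2*pi)
    sector = int(angle * k / (2 * math.pi))      # sector index in 0..k-1
    return cmath.exp(2j * math.pi * sector / k)  # k-th root of unity for that sector

# With k = 4 logic levels, a weighted sum in the second sector maps to i:
y = mvn_activation(complex(-1.0, 1.0), 4)
```

A binary neuron is the special case k = 2 (outputs +1 and -1), which is why such networks can be read as a generalization of bivalent processors.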