Supervising an Unsupervised Neural Network

T. D. Bui, Duy Khuong Nguyen, Tien Dat Ngo
{"title":"Supervising an Unsupervised Neural Network","authors":"T. D. Bui, Duy Khuong Nguyen, Tien Dat Ngo","doi":"10.1109/ACIIDS.2009.92","DOIUrl":null,"url":null,"abstract":"Machine learning is the field that is dedicated to the design and development of algorithms and techniques that allow computers to “learn”. Two common types of learning that are often mentioned are supervised learning and unsupervised learning. One often understands that in supervised learning, the system is given the desired output, and it is required to produce the correct output for the given input, while in unsupervised learning the system is given only the input and the objective is to find the natural structure inherent in the input data. We, however, suggest that even with unsupervised learning, the information inside the input, the structure of the input, and the sequence that the input is given to the system actually make the learning “supervised” in some way. Therefore, we recommend that in order to make the machine learn, even in a “supervised” manner, we should use an “unsupervised learning” model together with an appropriate way of presenting the input. We propose in this paper a simple plasticity neural network model that has the ability of storing information as well as storing the association between a pair of inputs. We then introduce two simple unsupervised learning rules and a framework to supervise our neural network.","PeriodicalId":275776,"journal":{"name":"2009 First Asian Conference on Intelligent Information and Database Systems","volume":"10878 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2009-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"4","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2009 First Asian Conference on Intelligent Information and Database Systems","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ACIIDS.2009.92","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 4

Abstract

Machine learning is the field dedicated to the design and development of algorithms and techniques that allow computers to “learn”. Two commonly mentioned types of learning are supervised learning and unsupervised learning. It is usually understood that in supervised learning the system is given the desired output and is required to produce the correct output for a given input, while in unsupervised learning the system is given only the input and the objective is to find the natural structure inherent in the input data. We, however, suggest that even with unsupervised learning, the information inside the input, the structure of the input, and the order in which the input is presented to the system actually make the learning “supervised” in some way. Therefore, we recommend that, in order to make a machine learn even in a “supervised” manner, an “unsupervised learning” model should be used together with an appropriate way of presenting the input. In this paper we propose a simple plasticity neural network model that can store information as well as the association between a pair of inputs. We then introduce two simple unsupervised learning rules and a framework to supervise our neural network.
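The paper itself provides no code here, but the idea of storing the association between a pair of inputs with a local plasticity rule can be illustrated with a minimal Hebbian-style sketch. The function names (`hebbian_association`, `recall`), the outer-product update, and the bipolar toy patterns below are illustrative assumptions, not the specific model or learning rules proposed by the authors.

```python
import numpy as np

def hebbian_association(pairs, dim_a, dim_b, lr=1.0):
    """Accumulate a weight matrix W so that W @ a approximates the pattern b
    that was paired with a, using a plain Hebbian outer-product update.
    (Illustrative sketch only; not the paper's learning rule.)"""
    W = np.zeros((dim_b, dim_a))
    for a, b in pairs:
        # Hebbian plasticity: strengthen weights between co-active units.
        W += lr * np.outer(b, a)
    return W

def recall(W, a):
    """Recall the stored partner of input `a`, thresholded to +/-1."""
    return np.sign(W @ a)

# Toy usage with random bipolar (+1/-1) patterns.
rng = np.random.default_rng(0)
a_patterns = [rng.choice([-1.0, 1.0], size=16) for _ in range(3)]
b_patterns = [rng.choice([-1.0, 1.0], size=16) for _ in range(3)]
W = hebbian_association(zip(a_patterns, b_patterns), dim_a=16, dim_b=16)

# Fraction of units recalled correctly for the first pair; crosstalk between
# stored pairs means recall is approximate rather than guaranteed exact.
print(np.mean(recall(W, a_patterns[0]) == b_patterns[0]))
```

Because each pair is written into the same weight matrix, interference grows with the number of stored pairs relative to the input dimensionality; the order in which pairs are presented and the structure of the inputs therefore shape what the network retains, which is the kind of implicit “supervision” the abstract points to.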