Building a Family of Neural Networks using Symmetry as a Foundation

R. Neville, Liping Zhao
{"title":"Building a Family of Neural Networks using Symmetry as a Foundation","authors":"R. Neville, Liping Zhao","doi":"10.1109/IJCNN.2007.4370922","DOIUrl":null,"url":null,"abstract":"In order to perform a function mapping task, a neural network needs two supporting mechanisms: an input and an output training vector, and a training regime. A new approach is proposed to generating a family of neural networks for performing a set of related functions. Within a family, only one network needs to be trained to perform an input-output function mapping task and other networks can be derived from this trained base network without training. The base net thus acts as a generator of the derived nets. The proposed approach builds on three mathematical foundations: (1) symmetry for defining the relationship between functions; (2) weight transformations for generating a family of networks; (3) Euclidian distance function for measuring the symmetric relationships between the related functions. The proposed approach provides a formal foundation for systemic information reuse in ANNs.","PeriodicalId":350091,"journal":{"name":"2007 International Joint Conference on Neural Networks","volume":"16 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2007-10-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2007 International Joint Conference on Neural Networks","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN.2007.4370922","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

In order to perform a function mapping task, a neural network needs two supporting mechanisms: a set of input and output training vectors, and a training regime. A new approach is proposed for generating a family of neural networks that perform a set of related functions. Within a family, only one network needs to be trained to perform an input-output function mapping task; the other networks can be derived from this trained base network without training. The base net thus acts as a generator of the derived nets. The proposed approach builds on three mathematical foundations: (1) symmetry, for defining the relationship between functions; (2) weight transformations, for generating a family of networks; (3) a Euclidean distance function, for measuring the symmetric relationships between the related functions. The proposed approach provides a formal foundation for systematic information reuse in ANNs.
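The abstract does not spell out the specific weight transformations used, so the following is only a minimal illustrative sketch of the general idea: a trained base network, a derived network obtained purely by a weight transformation that realizes a simple input-reflection symmetry (computing f(-x) from a net trained on f(x)), and a Euclidean distance between the two weight sets. All names (base_net, derived_net, W1, W2, etc.) and the choice of symmetry are hypothetical, not taken from the paper.

```python
import numpy as np

# Hypothetical base network: a one-hidden-layer MLP
#   y = W2 @ tanh(W1 @ x + b1) + b2
# assumed to have already been trained to approximate some function f(x).
rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(8, 1)), rng.normal(size=(8, 1))
W2, b2 = rng.normal(size=(1, 8)), rng.normal(size=(1, 1))

def base_net(x):
    """Base (trained) network approximating f(x)."""
    return W2 @ np.tanh(W1 @ x + b1) + b2

# Derived network for the reflected function g(x) = f(-x), obtained purely
# by a weight transformation (negating the input-layer weights), with no
# additional training -- the base net acts as a generator of the derived net.
W1_ref = -W1

def derived_net(x):
    """Derived network computing f(-x) without retraining."""
    return W2 @ np.tanh(W1_ref @ x + b1) + b2

# A Euclidean distance between the base and derived weight vectors gives one
# way to quantify how far apart the two symmetry-related networks are.
dist = np.linalg.norm(W1.ravel() - W1_ref.ravel())

x = np.array([[0.3]])
assert np.allclose(derived_net(x), base_net(-x))  # the symmetry holds exactly
print("Euclidean distance between base and derived input weights:", dist)
```

Because negating W1 maps every hidden-unit pre-activation W1 x + b1 to W1 (-x) + b1, the derived net reproduces the base net's output under the reflected input exactly; no retraining is needed, which is the kind of information reuse the paper formalizes.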