Training of Multi-layered Neural Network for Data Enlargement Processing Using an Activity Function

Betere Job Isaac, H. Kinjo, K. Nakazono, Naoki Oshiro
{"title":"Training of Multi-layered Neural Network for Data Enlargement Processing Using an Activity Function","authors":"Betere Job Isaac, H. Kinjo, K. Nakazono, Naoki Oshiro","doi":"10.17265/2328-2223/2019.01.001","DOIUrl":null,"url":null,"abstract":"In this paper, we present a study on activity functions for an MLNN (multi-layered neural network) and propose a suitable activity function for data enlargement processing. We have carefully studied the training performance of Sigmoid, ReLu, Leaky-ReLu and L & exp. activity functions for few inputs to multiple output training patterns. Our MLNNs model has L hidden layers with two or three inputs to four or six outputs data variations by BP (backpropagation) NN (neural network) training. We focused on the multi teacher training signals to investigate and evaluate the training performance in MLNNs to select the best and good activity function for data enlargement and hence could be applicable for image and signal processing (synaptic divergence) along with the proposed methods with convolution networks. We specifically used four activity functions from which we found out that L & exp. activity function can suite DENN (data enlargement neural network) training since it could give the highest percentage training abilities compared to the other activity functions of Sigmoid, ReLu and Leaky-ReLu during simulation and training of data in the network. And finally, we recommend L & exp. function to be good for MLNNs and may be applicable for signal processing of data and information enlargement because of its performance training characteristics with multiple teacher training patterns using original generated data and hence can be tried with CNN (convolution neural networks) of image processing.","PeriodicalId":382952,"journal":{"name":"J. of Electrical Engineering","volume":"49 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-02-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"J. of Electrical Engineering","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.17265/2328-2223/2019.01.001","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

In this paper, we present a study of activity functions for an MLNN (multi-layered neural network) and propose a suitable activity function for data enlargement processing. We carefully studied the training performance of the Sigmoid, ReLU, Leaky-ReLU, and L & exp. activity functions on training patterns that map a few inputs to multiple outputs. Our MLNN model has L hidden layers and maps two or three inputs to four or six output data variations, trained by BP (backpropagation) NN (neural network) training. We focused on multi-teacher training signals to investigate and evaluate the training performance of MLNNs and to select the activity function best suited for data enlargement, which could then be applied to image and signal processing (synaptic divergence) together with the proposed convolution-network methods. Among the four activity functions examined, we found that the L & exp. activity function suits DENN (data enlargement neural network) training, since it gave the highest percentage training ability compared with the Sigmoid, ReLU, and Leaky-ReLU functions during simulation and training of data in the network. Finally, we recommend the L & exp. function for MLNNs: because of its training performance with multiple teacher training patterns using originally generated data, it may be applicable to signal processing for data and information enlargement, and could also be tried with CNNs (convolution neural networks) for image processing.
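The page reproduces only the abstract, but the comparison it describes can be sketched in code. The snippet below is a minimal illustration, not the authors' implementation: a one-hidden-layer BP network that "enlarges" three inputs to six teacher outputs, trained once per activity function. The exact definition of the paper's L & exp. function is not given on this page, so a placeholder form (linear for non-positive pre-activations, saturating exponential above zero) is assumed; the network size, random teacher patterns, learning rate, and epoch count are likewise illustrative.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

# (activation, derivative) pairs; derivatives are taken w.r.t. the
# pre-activation value x.
ACTIVITIES = {
    "sigmoid":    (sigmoid,
                   lambda x: sigmoid(x) * (1.0 - sigmoid(x))),
    "relu":       (lambda x: np.maximum(0.0, x),
                   lambda x: (x > 0).astype(float)),
    "leaky_relu": (lambda x: np.where(x > 0, x, 0.01 * x),
                   lambda x: np.where(x > 0, 1.0, 0.01)),
    # Placeholder for the paper's "L & exp." function (exact form not
    # reproduced here): linear for x <= 0, saturating exponential above.
    "l_and_exp":  (lambda x: np.where(x <= 0, x, 1.0 - np.exp(-x)),
                   lambda x: np.where(x <= 0, 1.0, np.exp(-x))),
}

def train(act_name, n_in=3, n_hidden=8, n_out=6, epochs=5000, lr=0.1):
    """Train a one-hidden-layer BP network mapping n_in inputs to n_out
    teacher outputs; return the final mean squared error."""
    f, df = ACTIVITIES[act_name]
    rng = np.random.default_rng(0)  # same data/weights for every function
    # Toy few-input -> multi-output teacher patterns, standing in for the
    # paper's multi-teacher training signals.
    X = rng.uniform(-1, 1, size=(16, n_in))
    T = rng.uniform(0, 1, size=(16, n_out))
    W1 = rng.normal(0, 0.5, size=(n_in, n_hidden))
    W2 = rng.normal(0, 0.5, size=(n_hidden, n_out))
    for _ in range(epochs):
        # Forward pass: the hidden layer uses the activity function under
        # test; the output layer stays sigmoidal so targets in (0, 1) are
        # reachable.
        z1 = X @ W1
        h = f(z1)
        z2 = h @ W2
        y = sigmoid(z2)
        # Backward pass for the squared-error loss 0.5 * sum((y - T)^2).
        d2 = (y - T) * y * (1 - y)
        d1 = (d2 @ W2.T) * df(z1)
        W2 -= lr * h.T @ d2 / len(X)
        W1 -= lr * X.T @ d1 / len(X)
    return float(np.mean((y - T) ** 2))

for name in ACTIVITIES:
    print(f"{name:10s}  final MSE = {train(name):.5f}")
```

In this sketch, a lower final MSE plays the role of the "percentage training ability" the abstract compares; on the paper's actual training patterns, and with its true L & exp. definition, the ranking could of course differ.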