Comparison The Performance of The Recurrent Neural Network in Reducing Training Parameters for Convolution Neural Network

Mohamad Ahmad Mounir Batikh, Mohamad Ayman Nael, Amer Bous
DOI: 10.26389/ajsrp.b030821
Journal: Journal of engineering sciences and information technology, vol. 40, no. 1
Published: 2022-03-27 (Journal Article)
Citations: 0

Abstract

Comparison The Performance of The Recurrent Neural Network in Reducing Training Parameters for Convolution Neural Network: مقارنة أداء الشبكات العصبونية التكرارية في تخفيض بارامترات التدريب لشبكات التلافيف العصبية
The study aims to reduce the number of parameters in the Convolutional Neural Network (CNN), one of the best techniques for extracting and classifying behavioral features in video files. These networks are very large, with many parameters distributed across the deep layers, especially the final layers responsible for classification. The parameter values are updated at every stage of network training, which consumes a great deal of memory and requires a very large memory space. In this research we reduce the number of parameters by using a lightweight Convolutional Neural Network (LWCNN). We chose the AlexNet network and modified it: we decreased the number of filters in the convolution layers and replaced the last layers of the network with one of the most important types of Recurrent Neural Network (RNN), using both Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). The work was tested during the research period on a dataset of 960 videos of typically developing children and children on the autism spectrum, recorded at a center for psychosocial support for people with special needs. The experimental results showed a significant decrease, of 84%, in the number of parameters of the system after linking the lightweight networks with the recurrent networks, and the LSTM network gave better results than the GRU in both accuracy and loss value.
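The core idea above, replacing a CNN's large fully connected classifier head with a small recurrent head, can be illustrated with a back-of-the-envelope parameter count. This is a hypothetical sketch: the fully connected sizes are the standard AlexNet classifier dimensions, while the LSTM hidden size (256) and the two-class output are assumptions for illustration, not values reported in the paper.

```python
# Hypothetical parameter-count comparison: AlexNet's fully connected
# classifier head versus a small LSTM head, as in the LWCNN idea above.
# AlexNet FC sizes are standard; the LSTM hidden size and class count
# are assumed for illustration only.

def fc_params(n_in, n_out):
    """Weights plus biases of one fully connected layer."""
    return n_in * n_out + n_out

def lstm_params(n_in, n_hidden):
    """One LSTM layer: 4 gates, each with input weights,
    recurrent weights, and a bias vector."""
    return 4 * (n_in * n_hidden + n_hidden * n_hidden + n_hidden)

# Standard AlexNet classifier: 9216 -> 4096 -> 4096 -> 1000
alexnet_head = (fc_params(9216, 4096)
                + fc_params(4096, 4096)
                + fc_params(4096, 1000))

# Assumed replacement head: one LSTM over the 9216-dim CNN features,
# followed by a 2-way classifier (typical vs. autism-spectrum).
lstm_head = lstm_params(9216, 256) + fc_params(256, 2)

reduction = 1 - lstm_head / alexnet_head
print(f"FC head:   {alexnet_head:,} parameters")
print(f"LSTM head: {lstm_head:,} parameters")
print(f"Reduction: {reduction:.0%}")
```

Under these assumed sizes the recurrent head removes over 80% of the classifier's parameters, the same order of magnitude as the 84% reduction the abstract reports. A GRU head would be smaller still, since a GRU has 3 gates rather than the LSTM's 4.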