Comparison of the Performance of Recurrent Neural Networks in Reducing Training Parameters for Convolutional Neural Networks
Mohamad Ahmad Mounir Batikh, Mohamad Ayman Nael, Amer Bous
Journal of Engineering Sciences and Information Technology, vol. 40, no. 1, published 2022-03-27. DOI: https://doi.org/10.26389/ajsrp.b030821
Abstract
The study aims to reduce the number of parameters in the Convolutional Neural Network (CNN), one of the best techniques for extracting and classifying behavioral features in video files. These networks are very large, with many parameters distributed across the deep layers, especially the last layers responsible for classification; because the parameter values are updated at every stage of training, the network consumes a great deal of memory and requires very large memory space. In this research we reduce the number of parameters by using a lightweight Convolutional Neural Network (LWCNN). We choose the AlexNet network, with some modifications: we decrease the number of filters in the convolution layers, and we replace the network's last layers with one of the most important types of Recurrent Neural Network (RNN), testing both Long Short-Term Memory (LSTM) and the Gated Recurrent Unit (GRU). The work was tested during the research period on a dataset of 960 videos of typically developing children and children on the autism spectrum, recorded at a center for psychosocial support for people with special needs. The experimental results showed a significant decrease, 84%, in the number of parameters of the lightweight network after linking it with the recurrent networks, and the LSTM gave better results than the GRU in both accuracy and loss.
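To see why replacing a CNN's dense classifier head with a recurrent layer cuts the parameter count so sharply, the layer sizes can be compared analytically. The sketch below is illustrative only: the feature size matches the standard AlexNet flattened output, but the hidden size of 256 and the two-class output are hypothetical choices, not the authors' exact configuration, and the resulting reduction (roughly 80%+) is in the same ballpark as, but not a reproduction of, the paper's reported 84%.

```python
def dense_params(n_in, n_out):
    # Fully connected layer: one weight per input-output pair, plus biases.
    return n_in * n_out + n_out

def lstm_params(n_in, n_hidden):
    # LSTM has 4 gates; each has input weights, recurrent weights, and biases.
    return 4 * (n_in * n_hidden + n_hidden * n_hidden + n_hidden)

def gru_params(n_in, n_hidden):
    # GRU has 3 gates with the same per-gate structure.
    return 3 * (n_in * n_hidden + n_hidden * n_hidden + n_hidden)

FEATURES = 9216   # flattened conv-feature size in standard AlexNet (6*6*256)
HIDDEN = 256      # hypothetical recurrent hidden size
CLASSES = 2       # e.g. typical vs. autism-spectrum behavior

# Original AlexNet-style head: two 4096-unit dense layers plus the output layer.
head_dense = (dense_params(FEATURES, 4096)
              + dense_params(4096, 4096)
              + dense_params(4096, CLASSES))

# Recurrent replacement: one LSTM (or GRU) layer plus a small output layer.
head_lstm = lstm_params(FEATURES, HIDDEN) + dense_params(HIDDEN, CLASSES)
head_gru = gru_params(FEATURES, HIDDEN) + dense_params(HIDDEN, CLASSES)

print(f"dense head: {head_dense:,}")
print(f"LSTM head:  {head_lstm:,} ({1 - head_lstm / head_dense:.0%} fewer)")
print(f"GRU head:   {head_gru:,} ({1 - head_gru / head_dense:.0%} fewer)")
```

The dominant term in the dense head is the first 9216-by-4096 weight matrix; the recurrent head replaces both 4096-unit layers with a single small hidden state, which is where nearly all of the savings come from.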