Hamad Younis, Muhammad Hassan, Shahzad Younis, Muhammad Shafique
Title: Team of Tiny ANNs: A Way Towards Cost-Efficient Scalable Deep Learning
Venue: 2022 2nd International Conference on Artificial Intelligence (ICAI)
Published: 2022-03-30
DOI: 10.1109/ICAI55435.2022.9773451
Citations: 0
Abstract
Deep neural networks (DNNs) have recently achieved enormous success in various image recognition tasks. However, training large DNN models is computationally expensive and memory intensive, so a natural approach is to compress and accelerate the network without significantly diminishing the model's performance. In this paper, we propose a fast and accurate method of training a neural network with a small computation time and fewer parameters. Features are extracted using the Discrete Wavelet Transform (DWT), and a voting-based classifier comprising a team of tiny artificial neural networks is proposed. The classifier combines the classification votes from the different sub-band (models) to obtain the final class label, achieving classification accuracy similar to that of a standard neural network architecture. Experiments were conducted on the benchmark MNIST and EMNIST datasets. On MNIST, the trained models achieve a highest accuracy of 93.16% for original images and 90.44% for Low-Low (LL) sub-band images; on EMNIST, accuracies of 90.13% for original and 87.40% for LL sub-band images were obtained.
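The two building blocks the abstract describes — extracting a DWT sub-band as a compact feature map, and majority-voting across the tiny per-sub-band models — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes a single-level 2-D Haar transform (whose LL sub-band is the average of each 2x2 pixel block, halving both image dimensions, e.g. 28x28 MNIST digits become 14x14) and a plain majority vote over the class labels predicted by the team.

```python
import numpy as np

def haar_ll_subband(img):
    """Return the LL (approximation) sub-band of a one-level 2-D Haar DWT.

    Each 2x2 block of pixels is summed and scaled by 1/2 (the orthonormal
    Haar scaling), so a 28x28 image yields a 14x14 feature map with
    one quarter of the original inputs for a tiny ANN to process.
    """
    h, w = img.shape
    img = img[:h - h % 2, :w - w % 2].astype(float)  # trim odd edges
    return (img[0::2, 0::2] + img[0::2, 1::2] +
            img[1::2, 0::2] + img[1::2, 1::2]) / 2.0

def majority_vote(votes):
    """Combine the class votes of the team of tiny ANNs.

    `votes` holds one predicted class label per sub-band model;
    the label chosen by the most models becomes the final prediction.
    """
    votes = np.asarray(votes, dtype=int)
    return int(np.bincount(votes).argmax())

# Example: shrink a toy 4x4 "image" and fuse three models' votes.
tiny_features = haar_ll_subband(np.arange(16).reshape(4, 4))  # shape (2, 2)
final_label = majority_vote([1, 2, 2])  # two of three models say class 2
```

In the paper's scheme each sub-band (LL, LH, HL, HH) would feed its own small network, and `majority_vote` would fuse their outputs; how ties are broken is not specified in the abstract (here `bincount(...).argmax()` picks the smallest tied label).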