{"title":"A Computationally Light Pruning Strategy for Single Layer Neural Networks based on Threshold Function","authors":"E. Ragusa, C. Gianoglio, R. Zunino, P. Gastaldo","doi":"10.1109/ICECS46596.2019.8964894","DOIUrl":null,"url":null,"abstract":"Embedded machine learning relies on inference functions that can fit resource-constrained, low-power computing devices. The literature proves that single layer neural networks using threshold functions can provide a suitable trade off between classification accuracy and computational cost. In this regard, the number of neurons directly impacts both on computational complexity and on resources allocation. Thus, the present research aims at designing an efficient pruning technique that can take into account the peculiarities of the threshold function. The paper shows that feature selection criteria based on filter models can effectively be applied to neuron selection. In particular, valuable outcomes can be obtained by designing ad-hoc objective functions for the selection process. An extensive experimental campaign confirms that the proposed objective function compares favourably with state-of-the-art pruning techniques.","PeriodicalId":209054,"journal":{"name":"2019 26th IEEE International Conference on Electronics, Circuits and Systems (ICECS)","volume":"55 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-11-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 26th IEEE International Conference on Electronics, Circuits and Systems (ICECS)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/ICECS46596.2019.8964894","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Embedded machine learning relies on inference functions that can fit resource-constrained, low-power computing devices. The literature shows that single layer neural networks using threshold activation functions can provide a suitable trade-off between classification accuracy and computational cost. In this regard, the number of neurons directly affects both computational complexity and resource allocation. Thus, the present research aims to design an efficient pruning technique that takes into account the peculiarities of the threshold function. The paper shows that feature selection criteria based on filter models can be effectively applied to neuron selection. In particular, valuable outcomes can be obtained by designing ad hoc objective functions for the selection process. An extensive experimental campaign confirms that the proposed objective function compares favourably with state-of-the-art pruning techniques.
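The abstract does not reproduce the paper's ad hoc objective function, so the following is only a minimal sketch of the general idea it describes: rank the hidden neurons of a single-layer threshold network with a filter-style criterion and keep the best-scoring ones. The sketch assumes random input weights, a hard-threshold (step) activation, absolute Pearson correlation between each neuron's output and the label as a stand-in filter score, and a regularized least-squares refit of the output weights after pruning; all function names (`threshold_hidden`, `filter_scores`, `prune_and_refit`) are illustrative, not taken from the paper.

```python
import numpy as np

def threshold_hidden(X, W, b):
    """Hidden-layer outputs with a hard-threshold (step) activation."""
    return (X @ W + b > 0).astype(float)

def filter_scores(H, y):
    """Score each hidden neuron by the absolute Pearson correlation between
    its binary output and the label (a simple filter criterion used here as
    a placeholder; the paper's ad hoc objective function may differ)."""
    Hc = H - H.mean(axis=0)
    yc = y - y.mean()
    num = Hc.T @ yc
    den = np.linalg.norm(Hc, axis=0) * np.linalg.norm(yc) + 1e-12
    return np.abs(num / den)

def prune_and_refit(X, y, W, b, keep):
    """Keep the `keep` best-scoring neurons and refit the output weights
    by ridge-regularized least squares."""
    H = threshold_hidden(X, W, b)
    idx = np.argsort(filter_scores(H, y))[::-1][:keep]
    Hk = H[:, idx]
    beta = np.linalg.solve(Hk.T @ Hk + 1e-3 * np.eye(keep), Hk.T @ y)
    return idx, beta

# Toy usage: random-projection hidden layer, binary labels in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.standard_normal((200, 10))
y = np.sign(X[:, 0] + 0.1 * rng.standard_normal(200))
W = rng.standard_normal((10, 64))
b = rng.standard_normal(64)
idx, beta = prune_and_refit(X, y, W, b, keep=16)
pred = np.sign(threshold_hidden(X, W, b)[:, idx] @ beta)
print("train accuracy after pruning:", (pred == y).mean())
```

The filter-based ranking only requires one forward pass over the hidden layer plus per-neuron statistics, which is consistent with the "computationally light" goal stated in the title; the specific scoring function and refit strategy above are assumptions for illustration.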