Fault injection attacks on SoftMax function in deep neural networks
Dirmanto Jap, Yoo-Seung Won, S. Bhasin
Proceedings of the 18th ACM International Conference on Computing Frontiers, May 11, 2021
DOI: 10.1145/3457388.3458870
Citations: 5
Abstract
Softmax is a commonly used activation function in neural networks that normalizes the output into a probability distribution over the predicted classes. Because it is typically deployed in the output layer, it is a potential target for fault injection attacks aimed at causing misclassification. In this extended abstract, we perform a preliminary fault analysis of Softmax against single-bit faults.
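To illustrate the setting the abstract describes, the sketch below computes a softmax over a small logit vector and then simulates a single-bit fault by flipping one bit of a logit's IEEE-754 representation. This is a minimal illustrative model, not the paper's exact fault-injection methodology; the choice of flipping the sign bit of the top logit is an assumption made here to show how one bit flip can change the predicted class.

```python
import math
import struct

def softmax(logits):
    # Numerically stable softmax: subtract the max before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def flip_bit(value, bit):
    # Flip one bit of a float's 64-bit IEEE-754 encoding.
    # (Illustrative single-bit fault model; assumption, not the paper's setup.)
    (bits,) = struct.unpack("<Q", struct.pack("<d", value))
    (faulty,) = struct.unpack("<d", struct.pack("<Q", bits ^ (1 << bit)))
    return faulty

logits = [2.0, 1.0, 0.1]
clean = softmax(logits)

# Inject a single-bit fault into the sign bit (bit 63) of the largest logit,
# turning 2.0 into -2.0.
faulty_logits = list(logits)
faulty_logits[0] = flip_bit(faulty_logits[0], 63)
faulty = softmax(faulty_logits)

print("clean prediction:", clean.index(max(clean)))
print("faulty prediction:", faulty.index(max(faulty)))
```

With the clean logits the model predicts class 0; after the sign-bit flip the probability mass shifts and the predicted class changes, which is the misclassification effect the abstract refers to.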