{"title":"基于分段LUT的Softmax函数的硬件实现","authors":"Xiao Dong, Xiaolei Zhu, De Ma","doi":"10.1109/IWOFC48002.2019.9078446","DOIUrl":null,"url":null,"abstract":"Deep neural networks (DNN) achieve great results in many fields. While softmax function is widely used in DNN, how to implement it on hardware considering the accuracy, speed, area, and power, is a critical issue. This paper proposes a piecewise exponential lookup table (LUT) method, which reduces the LUT size of the exponential function in DNN's softmax layer. The experiment results show that the hardware using this method consumes less area and power resources than the previous work. The experiment input has a wide range and high accuracy, the absolute error of the calculation result is up to 4.5×10−6. The experiment results prove the proposed design is suitable for the softmax layer in most hardware implementation of DNN.","PeriodicalId":266774,"journal":{"name":"2019 IEEE International Workshop on Future Computing (IWOFC","volume":"28 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-12-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"3","resultStr":"{\"title\":\"Hardware Implementation of Softmax Function Based on Piecewise LUT\",\"authors\":\"Xiao Dong, Xiaolei Zhu, De Ma\",\"doi\":\"10.1109/IWOFC48002.2019.9078446\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Deep neural networks (DNN) achieve great results in many fields. While softmax function is widely used in DNN, how to implement it on hardware considering the accuracy, speed, area, and power, is a critical issue. This paper proposes a piecewise exponential lookup table (LUT) method, which reduces the LUT size of the exponential function in DNN's softmax layer. The experiment results show that the hardware using this method consumes less area and power resources than the previous work. The experiment input has a wide range and high accuracy, the absolute error of the calculation result is up to 4.5×10−6. The experiment results prove the proposed design is suitable for the softmax layer in most hardware implementation of DNN.\",\"PeriodicalId\":266774,\"journal\":{\"name\":\"2019 IEEE International Workshop on Future Computing (IWOFC\",\"volume\":\"28 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2019-12-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"3\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2019 IEEE International Workshop on Future Computing (IWOFC\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IWOFC48002.2019.9078446\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2019 IEEE International Workshop on Future Computing (IWOFC","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IWOFC48002.2019.9078446","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Hardware Implementation of Softmax Function Based on Piecewise LUT
Deep neural networks (DNNs) achieve excellent results in many fields. Since the softmax function is widely used in DNNs, how to implement it in hardware while balancing accuracy, speed, area, and power is a critical issue. This paper proposes a piecewise exponential lookup table (LUT) method that reduces the LUT size of the exponential function in a DNN's softmax layer. Experimental results show that hardware using this method consumes less area and power than previous work. The design accepts inputs over a wide range with high accuracy: the absolute error of the computed result is at most 4.5×10⁻⁶. These results demonstrate that the proposed design is suitable for the softmax layer in most hardware implementations of DNNs.
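
The abstract does not spell out the LUT decomposition itself, so the sketch below illustrates one standard form of the piecewise-LUT idea: splitting the exponent so that e^x = e^(x_int) · e^(x_frac), which lets two small tables replace a single large one over the whole input range. This is a minimal software model, not the paper's hardware design; the input range, table sizes, and FRAC_BITS resolution are illustrative assumptions.

    # Minimal software sketch of a piecewise exponential LUT for softmax.
    # Assumed decomposition (not from the paper): e^x = e^(x_int) * e^(x_frac),
    # so a small coarse table and a small fine table replace one large table.
    # Table sizes, bit widths, and the input range are illustrative.

    import numpy as np

    FRAC_BITS = 8                      # fractional resolution (assumed)
    X_MIN, X_MAX = -16, 0              # inputs after max-subtraction are <= 0

    # Coarse table: e^k for each integer k in [X_MIN, X_MAX]
    coarse_lut = np.exp(np.arange(X_MIN, X_MAX + 1, dtype=np.float64))

    # Fine table: e^f for each quantized fraction f in [0, 1)
    fine_lut = np.exp(np.arange(2 ** FRAC_BITS, dtype=np.float64) / 2 ** FRAC_BITS)

    def exp_lut(x: np.ndarray) -> np.ndarray:
        """Approximate e^x via two small LUTs instead of one large one."""
        x = np.clip(x, X_MIN, X_MAX)
        k = np.floor(x).astype(int)                        # coarse (integer) index
        f = np.round((x - k) * 2 ** FRAC_BITS).astype(int)  # fine (fraction) index
        # A fraction that rounds up to 1.0 carries into the coarse part.
        carry = f == 2 ** FRAC_BITS
        k = np.clip(k + carry, X_MIN, X_MAX)
        f = np.where(carry, 0, f)
        return coarse_lut[k - X_MIN] * fine_lut[f]

    def softmax_lut(x: np.ndarray) -> np.ndarray:
        """Softmax using the LUT-based exponential."""
        e = exp_lut(x - x.max())
        return e / e.sum()

    if __name__ == "__main__":
        logits = np.array([2.0, 1.0, 0.1])
        print(softmax_lut(logits))   # close to the exact softmax below
        exact = np.exp(logits - logits.max())
        print(exact / exact.sum())

Subtracting the maximum logit before the lookup bounds every exponent to a finite non-positive range, which is what allows a small coarse table to cover all inputs; a hardware version would apply the same normalization in fixed point.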