A Memory Based Approach for Digital Implementation of Tanh using LUT and RALUT

Samira Sorayassa, M. Ahmadi
Signal Processing and Vision, 2022-12-17
DOI: 10.5121/csit.2022.122204
Tangent hyperbolic (Tanh) is a preferred activation function for implementing multi-layer neural networks. Its differentiability makes it suitable for derivative-based learning algorithms such as the error back-propagation technique. In this paper, two memory-based techniques for accurate approximation and digital implementation of the Tanh function, using a Look-Up Table (LUT) and a Range Addressable Look-Up Table (RALUT), are presented. A thorough comparative study of the two techniques in terms of their FPGA hardware resource usage and their accuracy is given. The schematic of the synthesized design for a special case is given as an example.
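The two approaches can be contrasted in software before committing to hardware. A minimal sketch follows, assuming the usual formulations: a LUT samples tanh at uniformly spaced inputs and indexes by a truncated address, while a RALUT stores one value per input *range* over which the output stays within a tolerance, so fewer words are needed where the curve saturates. Table sizes, the input range bound, and the tolerance below are illustrative choices, not the paper's parameters.

```python
import math
import bisect

def build_lut(x_max=4.0, n_entries=256):
    """Uniform LUT: one tanh sample per equal-width input step on [0, x_max)."""
    step = x_max / n_entries
    table = [math.tanh((i + 0.5) * step) for i in range(n_entries)]
    return table, step

def lut_tanh(x, table, step):
    """Approximate tanh(x) by direct indexing; odd symmetry halves storage."""
    sign = -1.0 if x < 0 else 1.0
    idx = min(int(abs(x) / step), len(table) - 1)  # clamp into the table
    return sign * table[idx]

def build_ralut(x_max=4.0, tol=0.01, scan_step=1e-3):
    """RALUT: grow each range until tanh drifts more than `tol` from the
    range's starting value, then store one word for the whole range."""
    bounds, values = [0.0], []
    x = 0.0
    while x < x_max:
        y0 = math.tanh(x)
        x2 = x
        while x2 < x_max and abs(math.tanh(x2 + scan_step) - y0) < tol:
            x2 += scan_step
        values.append(math.tanh((x + x2) / 2))  # representative value
        x = x2 + scan_step
        bounds.append(x)
    return bounds, values

def ralut_tanh(x, bounds, values):
    """Approximate tanh(x) by locating the range that contains |x|."""
    sign = -1.0 if x < 0 else 1.0
    i = min(bisect.bisect_right(bounds, abs(x)) - 1, len(values) - 1)
    return sign * values[i]
```

Because tanh is nearly flat beyond |x| ≈ 2, the RALUT covers the saturated region with a handful of wide ranges, which is the source of its memory savings over a uniform LUT at comparable accuracy; the price is the range-decoding comparators (here, a binary search) instead of direct addressing.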