A Configurable FPGA Implementation of the Tanh Function Using DCT Interpolation

A. Abdelsalam, J. Langlois, F. Cheriet
{"title":"用DCT插值实现Tanh函数的可配置FPGA","authors":"A. Abdelsalam, J. Langlois, F. Cheriet","doi":"10.1109/FCCM.2017.12","DOIUrl":null,"url":null,"abstract":"Efficient implementation of non-linear activationfunctions is essential to the implementation of deep learningmodels on FPGAs. We introduce such an implementation basedon the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmeticoperations on the stored samples of the hyperbolic tangentfunction and on input data. It achieves almost 3 better precisionthan previous works while using a similar amount computationalresources and a small amount of memory. Various combinationsof DCTIF parameters can be chosen to trade off the accuracy andthe overall circuit complexity of the tanh function. In one case, the proposed architecture approximates the hyperbolic tangentactivation function with 0.004 maximum error while requiringonly 1.45 kbits BRAM memory and 21 LUTs of a Virtex-7 FPGA.","PeriodicalId":124631,"journal":{"name":"2017 IEEE 25th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)","volume":"18 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2017-04-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"13","resultStr":"{\"title\":\"A Configurable FPGA Implementation of the Tanh Function Using DCT Interpolation\",\"authors\":\"A. Abdelsalam, J. Langlois, F. Cheriet\",\"doi\":\"10.1109/FCCM.2017.12\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Efficient implementation of non-linear activationfunctions is essential to the implementation of deep learningmodels on FPGAs. We introduce such an implementation basedon the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmeticoperations on the stored samples of the hyperbolic tangentfunction and on input data. It achieves almost 3 better precisionthan previous works while using a similar amount computationalresources and a small amount of memory. Various combinationsof DCTIF parameters can be chosen to trade off the accuracy andthe overall circuit complexity of the tanh function. 
In one case, the proposed architecture approximates the hyperbolic tangentactivation function with 0.004 maximum error while requiringonly 1.45 kbits BRAM memory and 21 LUTs of a Virtex-7 FPGA.\",\"PeriodicalId\":124631,\"journal\":{\"name\":\"2017 IEEE 25th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)\",\"volume\":\"18 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2017-04-01\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"13\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2017 IEEE 25th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/FCCM.2017.12\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2017 IEEE 25th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/FCCM.2017.12","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 13

Abstract

Efficient implementation of non-linear activation functions is essential to the implementation of deep learning models on FPGAs. We introduce such an implementation based on the Discrete Cosine Transform Interpolation Filter (DCTIF). The proposed interpolation architecture combines simple arithmetic operations on the stored samples of the hyperbolic tangent function and on input data. It achieves almost 3× better precision than previous works while using a similar amount of computational resources and a small amount of memory. Various combinations of DCTIF parameters can be chosen to trade off the accuracy and the overall circuit complexity of the tanh function. In one case, the proposed architecture approximates the hyperbolic tangent activation function with 0.004 maximum error while requiring only 1.45 kbits of BRAM memory and 21 LUTs of a Virtex-7 FPGA.
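The abstract only sketches the approach: tanh samples are stored in a small table, and values between them are produced by an interpolation filter whose weights come from a DCT basis. The Python reference model below is a minimal sketch of that general idea under assumed parameters (a 64-entry table over [0, 4), a 4-tap filter, floating-point arithmetic, and saturation for large inputs); it is not the paper's exact DCTIF architecture or fixed-point design, and all names and constants are illustrative.

```python
import numpy as np

# Illustrative sketch of DCTIF-style tanh approximation: store tanh samples in
# a small table and interpolate between them with weights derived from a DCT
# basis. Table size, tap count, and input range are assumptions, not the
# parameters reported in the paper.

X_MAX = 4.0        # assumed saturation point: tanh(4) ~= 0.9993
N_SAMPLES = 64     # assumed table size
TAPS = 4           # assumed interpolation filter length
STEP = X_MAX / N_SAMPLES
TABLE = np.tanh(np.arange(N_SAMPLES) * STEP)   # stored tanh samples for x >= 0

def dctif_weights(pos, taps=TAPS):
    """Interpolation weights for a taps-sample window: forward orthonormal
    DCT-II of the window, then inverse evaluation at fractional position `pos`."""
    k = np.arange(taps)[:, None]
    m = np.arange(taps)[None, :]
    scale = np.where(k == 0, np.sqrt(1.0 / taps), np.sqrt(2.0 / taps))
    fwd = scale * np.cos(np.pi * k * (2 * m + 1) / (2 * taps))      # DCT-II matrix
    basis = scale[:, 0] * np.cos(np.pi * k[:, 0] * (2 * pos + 1) / (2 * taps))
    return basis @ fwd                                              # length-taps weight vector

def tanh_approx(x):
    """Approximate tanh(x) via table lookup plus DCT-based interpolation,
    using the odd symmetry tanh(-x) = -tanh(x)."""
    s, x = np.sign(x), abs(x)
    if x >= X_MAX - STEP * (TAPS - 1):
        return s * 1.0                        # saturate for large |x|
    t = x / STEP                              # position in table coordinates
    base = int(np.clip(np.floor(t) - 1, 0, N_SAMPLES - TAPS))
    window = TABLE[base:base + TAPS]
    return s * float(dctif_weights(t - base) @ window)

if __name__ == "__main__":
    xs = np.linspace(-6, 6, 4001)
    err = max(abs(tanh_approx(x) - np.tanh(x)) for x in xs)
    print(f"max |error| on [-6, 6]: {err:.2e}")
```

At integer table positions the weights reduce to selecting the stored sample exactly; the error between samples and at the saturation boundary is governed by the table spacing, tap count, and input range, which is the kind of accuracy-versus-complexity trade-off the DCTIF parameters control in the paper.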