Implementation of neural network with approximations functions
M. Hnatiuc, G. Lamarque
Signals, Circuits and Systems, 2003 (SCS 2003), International Symposium on
DOI: 10.1109/SCS.2003.1227112 · Published: 2003-07-10
Citations: 1
Abstract
The purpose of this work is to simulate a neural network with non-linear activation functions. The non-linear functions are simulated in Microsoft Visual C++ 6.0 to observe their precision before implementation on programmable logic devices. The network is designed to accept very small input values. Multiplication between input values and weight values is realized by adding logarithms and then applying the exponential function. All of the non-linear functions are approximated by piecewise-linear functions built from shift-and-add blocks.