{"title":"批量归一化非线性神经元模型的低功耗、高精度数字化设计:综合实验与FPGA评估","authors":"S. Kavitha , C. Kumar , Abdullah Alwabli","doi":"10.1016/j.asej.2025.103469","DOIUrl":null,"url":null,"abstract":"<div><div>The Batch-Normalization (BN) technique has gained significant attention in Neural Networks (NNs) due to its ability to mitigate the vanishing gradient problem, leading to slow learning rate in early epochs, complicating Activation Functions (AFs) optimization which significantly affects the NNs performance. This paper ultimately focuses on the Batch-Normalized Non-linear Neuron Models (BN-NLN) like Logistic, Softmax, LeakyReLU, Swish, TanH, ELU, SELU and APL. Simulations and FPGA implementations confirm that the proposed neurons outperform in terms of resource and interconnect utilization, delay at 10 MHz clock frequency. Notably, BNTANH stands out as a highly efficient low power neuron model. Extensive statistical analysis proves that proposed neuron models like BNLEAKYRELU, BNSWISH, BNTANH, BNELU, and BNSELU achieve impressive accuracy rates of 97 %, 98 %, 98 %, 97 %, and 98 % respectively, confirming the effectiveness of the proposed models in optimizing both power efficiency and computational accuracy.</div></div>","PeriodicalId":48648,"journal":{"name":"Ain Shams Engineering Journal","volume":"16 8","pages":"Article 103469"},"PeriodicalIF":6.0000,"publicationDate":"2025-05-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"A low-power, high accuracy digital design of batch normalized non-linear neuron models: Synthetic experiments and FPGA evaluation\",\"authors\":\"S. Kavitha , C. 
Kumar , Abdullah Alwabli\",\"doi\":\"10.1016/j.asej.2025.103469\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>The Batch-Normalization (BN) technique has gained significant attention in Neural Networks (NNs) due to its ability to mitigate the vanishing gradient problem, leading to slow learning rate in early epochs, complicating Activation Functions (AFs) optimization which significantly affects the NNs performance. This paper ultimately focuses on the Batch-Normalized Non-linear Neuron Models (BN-NLN) like Logistic, Softmax, LeakyReLU, Swish, TanH, ELU, SELU and APL. Simulations and FPGA implementations confirm that the proposed neurons outperform in terms of resource and interconnect utilization, delay at 10 MHz clock frequency. Notably, BNTANH stands out as a highly efficient low power neuron model. Extensive statistical analysis proves that proposed neuron models like BNLEAKYRELU, BNSWISH, BNTANH, BNELU, and BNSELU achieve impressive accuracy rates of 97 %, 98 %, 98 %, 97 %, and 98 % respectively, confirming the effectiveness of the proposed models in optimizing both power efficiency and computational accuracy.</div></div>\",\"PeriodicalId\":48648,\"journal\":{\"name\":\"Ain Shams Engineering Journal\",\"volume\":\"16 8\",\"pages\":\"Article 103469\"},\"PeriodicalIF\":6.0000,\"publicationDate\":\"2025-05-15\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Ain Shams Engineering Journal\",\"FirstCategoryId\":\"5\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S2090447925002102\",\"RegionNum\":2,\"RegionCategory\":\"工程技术\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q1\",\"JCRName\":\"ENGINEERING, 
MULTIDISCIPLINARY\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Ain Shams Engineering Journal","FirstCategoryId":"5","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S2090447925002102","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"ENGINEERING, MULTIDISCIPLINARY","Score":null,"Total":0}
A low-power, high accuracy digital design of batch normalized non-linear neuron models: Synthetic experiments and FPGA evaluation
The Batch-Normalization (BN) technique has gained significant attention in Neural Networks (NNs) for its ability to mitigate the vanishing gradient problem, which slows learning in early epochs and complicates the optimization of Activation Functions (AFs), significantly affecting NN performance. This paper focuses on Batch-Normalized Non-linear Neuron models (BN-NLN) based on the Logistic, Softmax, LeakyReLU, Swish, TanH, ELU, SELU, and APL activation functions. Simulations and FPGA implementations confirm that the proposed neurons outperform their counterparts in resource utilization, interconnect utilization, and delay at a 10 MHz clock frequency. Notably, BNTANH stands out as a highly efficient, low-power neuron model. Extensive statistical analysis shows that the proposed BNLEAKYRELU, BNSWISH, BNTANH, BNELU, and BNSELU models achieve accuracy rates of 97 %, 98 %, 98 %, 97 %, and 98 %, respectively, confirming their effectiveness in optimizing both power efficiency and computational accuracy.
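The batch-normalized neuron structure the abstract describes can be illustrated with a minimal floating-point sketch: a weighted sum, followed by batch normalization, followed by a non-linear activation (here TanH, mirroring the BNTANH model). This is an assumption-laden NumPy illustration of the general BN-then-activation pattern, not the paper's fixed-point FPGA implementation; the function names and parameter choices are mine.

```python
import numpy as np

def batch_norm(x, gamma=1.0, beta=0.0, eps=1e-5):
    # Standard batch normalization over the batch axis:
    # zero-mean, unit-variance, then scale (gamma) and shift (beta).
    mu = x.mean(axis=0)
    var = x.var(axis=0)
    return gamma * (x - mu) / np.sqrt(var + eps) + beta

def bn_tanh_neuron(x, w, b):
    # Hypothetical BNTANH-style neuron: pre-activation z = x.w + b,
    # batch-normalize z across the batch, then apply TanH.
    z = x @ w + b
    return np.tanh(batch_norm(z))

# Usage: a batch of 4 inputs with 3 features each.
rng = np.random.default_rng(0)
x = rng.normal(size=(4, 3))
w = rng.normal(size=(3,))
b = 0.1
out = bn_tanh_neuron(x, w, b)
print(out.shape)  # one activation per batch element
```

Normalizing the pre-activations keeps them centered in the steep region of TanH, which is the usual floating-point rationale for the BN-then-activation ordering; a hardware design would replace these operations with fixed-point equivalents.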
Journal overview:
Ain Shams Engineering Journal is an international journal devoted to the publication of peer-reviewed, original, high-quality research papers and review papers on both traditional topics and emerging science and technology. Areas of theoretical and fundamental interest are welcome, as are those concerning industrial applications, emerging instrumental techniques, and those with practical application to an aspect of human endeavor, such as preservation of the environment, health, and waste disposal. The overall focus is on original and rigorous scientific research results of generic significance.
Ain Shams Engineering Journal focuses on aspects of mechanical, electrical, civil, chemical, petroleum, environmental, and architectural and urban planning engineering. Papers that integrate knowledge from other disciplines with engineering are especially welcome, such as nanotechnology, materials science, and computational methods, as well as applied basic sciences: engineering mathematics, physics, and chemistry.