A Quantum Activation Function for Neural Networks: Proposal and Implementation

Saurabh Kumar, Siddharth Dangwal, Soumik Adhikary, D. Bhowmik
{"title":"A Quantum Activation Function for Neural Networks: Proposal and Implementation","authors":"Saurabh Kumar, Siddharth Dangwal, Soumik Adhikary, D. Bhowmik","doi":"10.1109/IJCNN52387.2021.9533362","DOIUrl":null,"url":null,"abstract":"A non-linear activation function is an integral component of neural network algorithms used for various tasks such as data classification and pattern recognition. In neu-romorphic/emerging-hardware-based implementations of neural network algorithms, the non-linear activation function is often implemented through dedicated analog electronics. This enables faster execution of the activation function during training and inference of neural networks compared to conventional digital implementation. Here, with a similar motivation, we propose a novel non-linear activation function that can be used in a neural network for data classification. Our activation function can be implemented by taking advantage of the inherent nonlinearity in qubit preparation and SU(2) operation in quantum mechanics. These operations are directly implementable on quantum hardware through single-qubit quantum gates as we show here. In addition, the SU(2) parameters are adjustable here making the activation function adaptable; we adjust the parameters through classical feedback like in a variational algorithm in quantum machine learning. Using our proposed quantum activation function, we report accurate classification using popular machine learning data sets like Fisher's Iris, Wisconsin's Breast Cancer (WBC), Abalone, and MNIST on three different platforms: simulations on a classical computer, simulations on a quantum simulation framework like Qiskit, and experimental implementation on quantum hardware (IBM-Q). Then we use a Bloch-sphere-based approach to intuitively explain how our proposed quantum activation function, with its adaptability, helps in data classification.","PeriodicalId":396583,"journal":{"name":"2021 International Joint Conference on Neural Networks (IJCNN)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN52387.2021.9533362","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 1

Abstract

A non-linear activation function is an integral component of neural network algorithms used for various tasks such as data classification and pattern recognition. In neuromorphic/emerging-hardware-based implementations of neural network algorithms, the non-linear activation function is often implemented through dedicated analog electronics. This enables faster execution of the activation function during training and inference of neural networks compared to a conventional digital implementation. Here, with a similar motivation, we propose a novel non-linear activation function that can be used in a neural network for data classification. Our activation function can be implemented by taking advantage of the inherent non-linearity in qubit preparation and the SU(2) operation in quantum mechanics. These operations are directly implementable on quantum hardware through single-qubit quantum gates, as we show here. In addition, the SU(2) parameters are adjustable, making the activation function adaptable; we adjust the parameters through classical feedback, as in a variational quantum machine learning algorithm. Using our proposed quantum activation function, we report accurate classification on popular machine learning data sets such as Fisher's Iris, Wisconsin Breast Cancer (WBC), Abalone, and MNIST, on three different platforms: simulations on a classical computer, simulations on a quantum simulation framework (Qiskit), and experimental implementation on quantum hardware (IBM-Q). Finally, we use a Bloch-sphere-based approach to explain intuitively how our proposed quantum activation function, with its adaptability, helps in data classification.
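Since the abstract names Qiskit, the following is a minimal Qiskit sketch of the kind of single-qubit activation it describes. The specific choices here are illustrative assumptions, not the paper's exact circuit: the scalar input x is encoded as an R_y preparation angle, the adjustable SU(2) operation is Qiskit's generic single-qubit gate U(theta, phi, lam), and the activation output is taken as the Pauli-Z expectation value of the resulting state.

```python
# A minimal sketch of a single-qubit quantum activation function in Qiskit.
# Assumptions (not the paper's exact circuit): input x encoded via R_y,
# trainable SU(2) rotation via the generic U gate, output read off as <Z>.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def quantum_activation(x, theta, phi, lam):
    """Map a scalar input x to a non-linear output in [-1, 1]."""
    qc = QuantumCircuit(1)
    qc.ry(x, 0)               # qubit preparation: encode the input as a rotation
    qc.u(theta, phi, lam, 0)  # adjustable SU(2) operation (trainable parameters)
    probs = Statevector.from_instruction(qc).probabilities()
    return probs[0] - probs[1]  # <Z> = P(|0>) - P(|1>)

# Sweep a few inputs through the activation with fixed SU(2) parameters.
for x in np.linspace(-np.pi, np.pi, 5):
    print(f"x = {x:+.3f} -> activation = {quantum_activation(x, 0.3, 0.1, 0.2):+.3f}")
```

Because the measurement probabilities depend on cosines of the accumulated rotation angles, the output is non-linear in x, and the trainable SU(2) parameters shift and reshape the response curve, which is the adaptability the abstract refers to. In a variational loop, a classical optimizer would update (theta, phi, lam) from the measured classification loss.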