A Quantum Activation Function for Neural Networks: Proposal and Implementation
Saurabh Kumar, Siddharth Dangwal, Soumik Adhikary, D. Bhowmik
{"title":"神经网络的量子激活函数:提议与实现","authors":"Saurabh Kumar, Siddharth Dangwal, Soumik Adhikary, D. Bhowmik","doi":"10.1109/IJCNN52387.2021.9533362","DOIUrl":null,"url":null,"abstract":"A non-linear activation function is an integral component of neural network algorithms used for various tasks such as data classification and pattern recognition. In neu-romorphic/emerging-hardware-based implementations of neural network algorithms, the non-linear activation function is often implemented through dedicated analog electronics. This enables faster execution of the activation function during training and inference of neural networks compared to conventional digital implementation. Here, with a similar motivation, we propose a novel non-linear activation function that can be used in a neural network for data classification. Our activation function can be implemented by taking advantage of the inherent nonlinearity in qubit preparation and SU(2) operation in quantum mechanics. These operations are directly implementable on quantum hardware through single-qubit quantum gates as we show here. In addition, the SU(2) parameters are adjustable here making the activation function adaptable; we adjust the parameters through classical feedback like in a variational algorithm in quantum machine learning. Using our proposed quantum activation function, we report accurate classification using popular machine learning data sets like Fisher's Iris, Wisconsin's Breast Cancer (WBC), Abalone, and MNIST on three different platforms: simulations on a classical computer, simulations on a quantum simulation framework like Qiskit, and experimental implementation on quantum hardware (IBM-Q). Then we use a Bloch-sphere-based approach to intuitively explain how our proposed quantum activation function, with its adaptability, helps in data classification.","PeriodicalId":396583,"journal":{"name":"2021 International Joint Conference on Neural Networks (IJCNN)","volume":"38 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-07-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"1","resultStr":"{\"title\":\"A Quantum Activation Function for Neural Networks: Proposal and Implementation\",\"authors\":\"Saurabh Kumar, Siddharth Dangwal, Soumik Adhikary, D. Bhowmik\",\"doi\":\"10.1109/IJCNN52387.2021.9533362\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"A non-linear activation function is an integral component of neural network algorithms used for various tasks such as data classification and pattern recognition. In neu-romorphic/emerging-hardware-based implementations of neural network algorithms, the non-linear activation function is often implemented through dedicated analog electronics. This enables faster execution of the activation function during training and inference of neural networks compared to conventional digital implementation. Here, with a similar motivation, we propose a novel non-linear activation function that can be used in a neural network for data classification. Our activation function can be implemented by taking advantage of the inherent nonlinearity in qubit preparation and SU(2) operation in quantum mechanics. These operations are directly implementable on quantum hardware through single-qubit quantum gates as we show here. In addition, the SU(2) parameters are adjustable here making the activation function adaptable; we adjust the parameters through classical feedback like in a variational algorithm in quantum machine learning. 
Using our proposed quantum activation function, we report accurate classification using popular machine learning data sets like Fisher's Iris, Wisconsin's Breast Cancer (WBC), Abalone, and MNIST on three different platforms: simulations on a classical computer, simulations on a quantum simulation framework like Qiskit, and experimental implementation on quantum hardware (IBM-Q). Then we use a Bloch-sphere-based approach to intuitively explain how our proposed quantum activation function, with its adaptability, helps in data classification.\",\"PeriodicalId\":396583,\"journal\":{\"name\":\"2021 International Joint Conference on Neural Networks (IJCNN)\",\"volume\":\"38 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-07-18\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"1\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2021 International Joint Conference on Neural Networks (IJCNN)\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IJCNN52387.2021.9533362\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2021 International Joint Conference on Neural Networks (IJCNN)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IJCNN52387.2021.9533362","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 1
Abstract
A non-linear activation function is an integral component of neural network algorithms used for tasks such as data classification and pattern recognition. In neuromorphic and other emerging-hardware-based implementations of neural networks, the non-linear activation function is often realized through dedicated analog electronics, which enables faster execution of the activation function during training and inference than conventional digital implementations. Here, with a similar motivation, we propose a novel non-linear activation function that can be used in a neural network for data classification. Our activation function exploits the inherent non-linearity of qubit preparation and of SU(2) operations in quantum mechanics. These operations are directly implementable on quantum hardware through single-qubit quantum gates, as we show here. In addition, the SU(2) parameters are adjustable, making the activation function adaptable; we adjust the parameters through classical feedback, as in a variational quantum machine learning algorithm. Using our proposed quantum activation function, we report accurate classification on popular machine learning data sets (Fisher's Iris, Wisconsin Breast Cancer (WBC), Abalone, and MNIST) on three different platforms: simulation on a classical computer, simulation on a quantum framework (Qiskit), and experimental implementation on quantum hardware (IBM-Q). We then use a Bloch-sphere-based approach to explain intuitively how our proposed quantum activation function, with its adaptability, helps in data classification.
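For intuition, here is a minimal classical (NumPy) sketch of the kind of activation the abstract describes: a scalar input is encoded into a qubit state, a parameterized SU(2) rotation is applied, and the probability of measuring |1⟩ serves as the activation output, with the rotation parameters tuned by classical feedback. The RY-style encoding, the U(θ, φ, λ) parameterization, and the finite-difference update below are illustrative assumptions, not the paper's exact construction.

```python
import numpy as np

def su2(theta, phi, lam):
    """General single-qubit rotation in the U(theta, phi, lambda)
    convention (an SU(2) element up to a global phase)."""
    return np.array([
        [np.cos(theta / 2), -np.exp(1j * lam) * np.sin(theta / 2)],
        [np.exp(1j * phi) * np.sin(theta / 2),
         np.exp(1j * (phi + lam)) * np.cos(theta / 2)],
    ])

def quantum_activation(x, params):
    """Illustrative activation: encode input x as a qubit,
    rotate by a trainable SU(2) gate, return P(|1>)."""
    theta, phi, lam = params
    # Assumed qubit preparation: |psi> = cos(x/2)|0> + sin(x/2)|1>
    psi = np.array([np.cos(x / 2), np.sin(x / 2)], dtype=complex)
    out = su2(theta, phi, lam) @ psi
    return np.abs(out[1]) ** 2  # non-linear in both x and the parameters

# Classical feedback, variational-style: nudge the SU(2) parameters by
# finite-difference gradient descent on a toy squared-error loss.
def update(params, x, target, lr=0.1, eps=1e-4):
    grad = np.zeros_like(params)
    for i in range(len(params)):
        shifted = params.copy()
        shifted[i] += eps
        grad[i] = (quantum_activation(x, shifted)
                   - quantum_activation(x, params)) / eps
    err = quantum_activation(x, params) - target
    return params - lr * 2 * err * grad

params = np.array([0.3, 0.0, 0.7])
for _ in range(100):
    params = update(params, x=0.5, target=0.9)
print(quantum_activation(0.5, params))  # should approach the 0.9 target
```

Because both the encoding and the measurement are trigonometric in the input and the gate angles, the resulting map is non-linear in the input yet smoothly tunable through the SU(2) parameters, which is what the classical feedback loop exploits.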
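The same two-gate circuit can be written in Qiskit, the simulation framework named in the abstract; this is a sketch under the same assumed RY encoding and U-gate parameterization, not the paper's published circuit:

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# One qubit: prepare from the input, then apply the trainable rotation.
qc = QuantumCircuit(1)
qc.ry(0.5, 0)            # qubit preparation from input x = 0.5 (assumed encoding)
qc.u(0.3, 0.0, 0.7, 0)   # parameterized SU(2) rotation U(theta, phi, lambda)

# Exact probabilities from the statevector; on IBM-Q hardware, P(|1>) would
# instead be estimated from repeated shot measurements.
p1 = Statevector.from_instruction(qc).probabilities()[1]
print(p1)  # activation output, P(|1>)
```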