{"title":"用于多类图像分类的多模态卷积和卷积神经网络","authors":"Yuri G. Gordienko, Yevhenii Trochun, S. Stirenko","doi":"10.3390/bdcc8070075","DOIUrl":null,"url":null,"abstract":"By utilizing hybrid quantum–classical neural networks (HNNs), this research aims to enhance the efficiency of image classification tasks. HNNs allow us to utilize quantum computing to solve machine learning problems, which can be highly power-efficient and provide significant computation speedup compared to classical operations. This is particularly relevant in sustainable applications where reducing computational resources and energy consumption is crucial. This study explores the feasibility of a novel architecture by leveraging quantum devices as the first layer of the neural network, which proved to be useful for scaling HNNs’ training process. Understanding the role of quanvolutional operations and how they interact with classical neural networks can lead to optimized model architectures that are more efficient and effective for image classification tasks. This research investigates the performance of HNNs across different datasets, including CIFAR100 and Satellite Images of Hurricane Damage by evaluating the performance of HNNs on these datasets in comparison with the performance of reference classical models. By evaluating the scalability of HNNs on diverse datasets, the study provides insights into their applicability across various real-world scenarios, which is essential for building sustainable machine learning solutions that can adapt to different environments. Leveraging transfer learning techniques with pre-trained models such as ResNet, EfficientNet, and VGG16 demonstrates the potential for HNNs to benefit from existing knowledge in classical neural networks. This approach can significantly reduce the computational cost of training HNNs from scratch while still achieving competitive performance. The feasibility study conducted in this research assesses the practicality and viability of deploying HNNs for real-world image classification tasks. By comparing the performance of HNNs with classical reference models like ResNet, EfficientNet, and VGG-16, this study provides evidence of the potential advantages of HNNs in certain scenarios. Overall, the findings of this research contribute to advancing sustainable applications of machine learning by proposing novel techniques, optimizing model architectures, and demonstrating the feasibility of adopting HNNs for real-world image classification problems. These insights can inform the development of more efficient and environmentally friendly machine learning solutions.","PeriodicalId":505155,"journal":{"name":"Big Data and Cognitive Computing","volume":null,"pages":null},"PeriodicalIF":0.0000,"publicationDate":"2024-07-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Multimodal Quanvolutional and Convolutional Neural Networks for Multi-Class Image Classification\",\"authors\":\"Yuri G. Gordienko, Yevhenii Trochun, S. Stirenko\",\"doi\":\"10.3390/bdcc8070075\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"By utilizing hybrid quantum–classical neural networks (HNNs), this research aims to enhance the efficiency of image classification tasks. HNNs allow us to utilize quantum computing to solve machine learning problems, which can be highly power-efficient and provide significant computation speedup compared to classical operations. 
This is particularly relevant in sustainable applications where reducing computational resources and energy consumption is crucial. This study explores the feasibility of a novel architecture by leveraging quantum devices as the first layer of the neural network, which proved to be useful for scaling HNNs’ training process. Understanding the role of quanvolutional operations and how they interact with classical neural networks can lead to optimized model architectures that are more efficient and effective for image classification tasks. This research investigates the performance of HNNs across different datasets, including CIFAR100 and Satellite Images of Hurricane Damage by evaluating the performance of HNNs on these datasets in comparison with the performance of reference classical models. By evaluating the scalability of HNNs on diverse datasets, the study provides insights into their applicability across various real-world scenarios, which is essential for building sustainable machine learning solutions that can adapt to different environments. Leveraging transfer learning techniques with pre-trained models such as ResNet, EfficientNet, and VGG16 demonstrates the potential for HNNs to benefit from existing knowledge in classical neural networks. This approach can significantly reduce the computational cost of training HNNs from scratch while still achieving competitive performance. The feasibility study conducted in this research assesses the practicality and viability of deploying HNNs for real-world image classification tasks. By comparing the performance of HNNs with classical reference models like ResNet, EfficientNet, and VGG-16, this study provides evidence of the potential advantages of HNNs in certain scenarios. Overall, the findings of this research contribute to advancing sustainable applications of machine learning by proposing novel techniques, optimizing model architectures, and demonstrating the feasibility of adopting HNNs for real-world image classification problems. These insights can inform the development of more efficient and environmentally friendly machine learning solutions.\",\"PeriodicalId\":505155,\"journal\":{\"name\":\"Big Data and Cognitive Computing\",\"volume\":null,\"pages\":null},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2024-07-08\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Big Data and Cognitive Computing\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.3390/bdcc8070075\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Big Data and Cognitive Computing","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.3390/bdcc8070075","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
By utilizing hybrid quantum–classical neural networks (HNNs), this research aims to enhance the efficiency of image classification tasks. HNNs allow us to apply quantum computing to machine learning problems, which can be highly power-efficient and can provide significant computational speedups over classical operations. This is particularly relevant for sustainable applications, where reducing computational resources and energy consumption is crucial. This study explores the feasibility of a novel architecture that uses quantum devices as the first layer of the neural network, which proved useful for scaling the HNN training process. Understanding the role of quanvolutional operations and how they interact with classical neural networks can lead to optimized model architectures that are more efficient and effective for image classification tasks. This research investigates the performance of HNNs across different datasets, including CIFAR100 and Satellite Images of Hurricane Damage, by comparing HNNs on these datasets against reference classical models. By evaluating the scalability of HNNs on diverse datasets, the study provides insights into their applicability across various real-world scenarios, which is essential for building sustainable machine learning solutions that can adapt to different environments. Leveraging transfer learning with pre-trained models such as ResNet, EfficientNet, and VGG16 demonstrates the potential for HNNs to benefit from existing knowledge in classical neural networks; this approach can significantly reduce the computational cost of training HNNs from scratch while still achieving competitive performance. The feasibility study conducted in this research assesses the practicality and viability of deploying HNNs for real-world image classification tasks. By comparing the performance of HNNs with classical reference models such as ResNet, EfficientNet, and VGG16, this study provides evidence of the potential advantages of HNNs in certain scenarios. Overall, the findings of this research contribute to advancing sustainable applications of machine learning by proposing novel techniques, optimizing model architectures, and demonstrating the feasibility of adopting HNNs for real-world image classification problems. These insights can inform the development of more efficient and environmentally friendly machine learning solutions.
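To make the "quantum device as the first layer" idea concrete, the following is a minimal illustrative sketch of a quanvolutional front-end feeding a classical head. The framework choice (PennyLane with a PyTorch head), the circuit design, the 2x2 patch size, and all hyperparameters are assumptions for illustration only; the abstract does not disclose the authors' actual implementation.

```python
# Illustrative sketch only: framework, circuit, and sizes are assumptions,
# not the authors' published architecture.
import pennylane as qml
import torch
import torch.nn as nn

n_qubits = 4  # one qubit per pixel of a 2x2 image patch
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev, interface="torch")
def quanv_circuit(patch):
    # Encode the four pixel intensities as rotation angles, entangle, and
    # read out one expectation value per qubit.
    for i in range(n_qubits):
        qml.RY(torch.pi * patch[i], wires=i)
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])
    return [qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]

def quanv_layer(image):
    """Slide a 2x2 window (stride 2) over a single-channel image and replace
    each patch with the circuit's four expectation values (four feature maps)."""
    h, w = image.shape
    out = torch.zeros(n_qubits, h // 2, w // 2)
    for r in range(0, h - 1, 2):
        for c in range(0, w - 1, 2):
            patch = torch.stack([image[r, c], image[r, c + 1],
                                 image[r + 1, c], image[r + 1, c + 1]])
            vals = quanv_circuit(patch)
            for k in range(n_qubits):
                out[k, r // 2, c // 2] = vals[k]
    return out

# A small classical head consuming the quanvolutional feature maps.
# For a 32x32 input (e.g., one CIFAR100 channel), the maps are 4 x 16 x 16.
classical_head = nn.Sequential(
    nn.Conv2d(n_qubits, 32, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.Flatten(),
    nn.Linear(32 * 16 * 16, 100),  # 100 classes, matching CIFAR100
)

# Usage example
image = torch.rand(32, 32)                       # single-channel toy input
features = quanv_layer(image)                    # shape: (4, 16, 16)
logits = classical_head(features.unsqueeze(0))   # shape: (1, 100)
```

In a transfer-learning variant of the kind the abstract describes, the small classical head above could plausibly be replaced by a pre-trained backbone such as ResNet, EfficientNet, or VGG16 operating on the quanvolutional feature maps; how the authors actually wire the pre-trained models is not specified in the abstract.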