A Comprehensive Analysis of Quaternion Deep Neural Networks: Architectures, Applications, Challenges, and Future Scope

IF 12.1 · CAS Tier 2 (Engineering & Technology) · JCR Q1, Computer Science, Interdisciplinary Applications
Sukhendra Singh, Sushil Kumar, B. K. Tripathi
{"title":"A Comprehensive Analysis of Quaternion Deep Neural Networks: Architectures, Applications, Challenges, and Future Scope","authors":"Sukhendra Singh,&nbsp;Sushil Kumar,&nbsp;B. K. Tripathi","doi":"10.1007/s11831-024-10216-1","DOIUrl":null,"url":null,"abstract":"<div><p>Quaternions are extensively used in several fields including physics, applied mathematics, computer graphics, and control systems because of their notable and unique characteristics. Embedding quaternions into deep neural networks has attracted significant attention to neurocomputing researchers in recent years. Quaternion’s algebra helps to reconstruct neural networks in the quaternionic domain. This paper comprehensively reviewed and analyzed the recent advancements in quaternion deep neural networks (QDNNs) and their practical applications. Several architectures integrating quaternions in deep neural networks such as quaternion convolutional neural networks, quaternion recurrent neural networks, quaternion self-attention networks, hypercomplex convolutional neural networks, quaternion long-short term memory networks, quaternion residual networks, and quaternion variational autoencoders are thoroughly examined and reviewed with applications. It is observed that they have outperformed conventional real-valued neural networks. This study also discusses the main discoveries and possible advanced mechanisms of QDNN for future research. The open challenges and future scopes of QDNNs are also addressed, which provides the right direction of work in this field. This review may help researchers interested in architectural advancements and their practical applications.</p></div>","PeriodicalId":55473,"journal":{"name":"Archives of Computational Methods in Engineering","volume":"32 4","pages":"2607 - 2634"},"PeriodicalIF":12.1000,"publicationDate":"2024-12-28","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Archives of Computational Methods in Engineering","FirstCategoryId":"5","ListUrlMain":"https://link.springer.com/article/10.1007/s11831-024-10216-1","RegionNum":2,"RegionCategory":"工程技术","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, INTERDISCIPLINARY APPLICATIONS","Score":null,"Total":0}
引用次数: 0

Abstract

Quaternions are extensively used in several fields, including physics, applied mathematics, computer graphics, and control systems, because of their notable and unique characteristics. Embedding quaternions into deep neural networks has attracted significant attention from neurocomputing researchers in recent years. Quaternion algebra makes it possible to reconstruct neural networks in the quaternionic domain. This paper comprehensively reviews and analyzes recent advancements in quaternion deep neural networks (QDNNs) and their practical applications. Several architectures integrating quaternions into deep neural networks, such as quaternion convolutional neural networks, quaternion recurrent neural networks, quaternion self-attention networks, hypercomplex convolutional neural networks, quaternion long short-term memory networks, quaternion residual networks, and quaternion variational autoencoders, are thoroughly examined and reviewed along with their applications. These models are observed to outperform conventional real-valued neural networks. This study also discusses the main findings and possible advanced mechanisms of QDNNs for future research. The open challenges and future scope of QDNNs are also addressed, pointing to promising directions for work in this field. This review may help researchers interested in architectural advancements and their practical applications.
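To make the idea of reconstructing a network in the quaternionic domain concrete, the sketch below (a minimal NumPy illustration, not the authors' implementation) shows the Hamilton product and a toy quaternion-valued linear layer, the basic building block that quaternion convolutional, recurrent, and attention layers reuse. The names `hamilton_product` and `quaternion_linear` and the (out, in, 4) weight layout are illustrative assumptions.

```python
# Minimal sketch (assumed, not from the paper): Hamilton product and a
# toy quaternion linear layer operating on (r, x, y, z) components.
import numpy as np

def hamilton_product(p, q):
    """Hamilton product of two quaternions given as length-4 arrays (r, x, y, z)."""
    r1, x1, y1, z1 = p
    r2, x2, y2, z2 = q
    return np.array([
        r1*r2 - x1*x2 - y1*y2 - z1*z2,   # real part
        r1*x2 + x1*r2 + y1*z2 - z1*y2,   # i component
        r1*y2 - x1*z2 + y1*r2 + z1*x2,   # j component
        r1*z2 + x1*y2 - y1*x2 + z1*r2,   # k component
    ])

def quaternion_linear(W, x):
    """Apply quaternion weights W (shape [out, in, 4]) to a quaternion
    input x (shape [in, 4]) by summing Hamilton products, so the four
    channels share parameters instead of being treated independently."""
    out = np.zeros((W.shape[0], 4))
    for i in range(W.shape[0]):
        for j in range(W.shape[1]):
            out[i] += hamilton_product(W[i, j], x[j])
    return out

# Toy usage: map 3 quaternion inputs to 2 quaternion outputs.
rng = np.random.default_rng(0)
W = rng.normal(size=(2, 3, 4))
x = rng.normal(size=(3, 4))
print(quaternion_linear(W, x).shape)  # (2, 4)
```

Because each weight is a single quaternion acting jointly on a 4-channel input, such a layer needs roughly a quarter of the parameters of a real-valued layer mapping the same input/output dimensionality, which is one of the efficiency arguments commonly made in the QDNN literature.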

Source Journal
CiteScore: 19.80
Self-citation rate: 4.10%
Articles published: 153
Review time: >12 weeks
Journal Introduction
Aim and Scope: Archives of Computational Methods in Engineering serves as an active forum for disseminating research and advanced practices in computational engineering, with a particular focus on mechanics and related fields. The journal emphasizes extended state-of-the-art reviews in selected areas, a unique feature of its publication.
Review Format: Reviews published in the journal offer a survey of the current literature and a critical exposition of topics in their full complexity. By organizing the information in this manner, readers can quickly grasp the focus, coverage, and unique features of Archives of Computational Methods in Engineering.