Hypercomplex neural networks: Exploring quaternion, octonion, and beyond in deep learning

Raghavendra M Devadas, Vani Hiremani, Preethi, Sowmya T, Sapna R, Praveen Gujjar

MethodsX, Volume 15, Article 103644. Published 2025-09-24. DOI: 10.1016/j.mex.2025.103644
Abstract
Hypercomplex Neural Networks (HNNs) represent the next frontier in deep learning, building on the mathematical theory of quaternions, octonions, and higher-dimensional algebras to generalize conventional neural architectures. This review synthesizes cutting-edge methods together with their theoretical bases, architectural advancements, and primary applications, tracing the development of hypercomplex mathematics and its implementation in computational models. We distil key advances in quaternion and octonion networks, highlighting the compact representations and computational efficiency they offer. Particular attention is given to the unique challenge of non-associativity in octonions (the order in which elements are multiplied affects the result), which requires careful design of network operations. The article also discusses training complexity, interpretability, and the lack of standardized frameworks, alongside comparative performance against real- and complex-valued networks. Future directions include scalable algorithm construction, lightweight architectures through tensor decompositions, and integration with quantum-inspired systems using higher-order algebras. By presenting a systematic synthesis of the current literature and linking these advances to practical applications, this review aims to equip researchers and practitioners with a clear understanding of the strengths, limitations, and potential of HNNs for advancing multidimensional data modelling.
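To make the non-associativity point concrete, the following minimal Python sketch (illustrative only, not code from the reviewed paper) builds octonion multiplication from the quaternion Hamilton product via the Cayley-Dickson construction and checks that (e1·e2)·e4 and e1·(e2·e4) differ in sign. The helper names `qmul`, `qconj`, and `omul` are assumptions introduced for this example.

```python
# Illustrative sketch: octonion multiplication via the Cayley-Dickson
# construction over quaternions, demonstrating non-associativity.
# Hypothetical helpers, not code from the reviewed paper.

def qmul(p, q):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return (
        w1 * w2 - x1 * x2 - y1 * y2 - z1 * z2,
        w1 * x2 + x1 * w2 + y1 * z2 - z1 * y2,
        w1 * y2 - x1 * z2 + y1 * w2 + z1 * x2,
        w1 * z2 + x1 * y2 - y1 * x2 + z1 * w2,
    )

def qconj(p):
    """Quaternion conjugate: negate the imaginary parts."""
    w, x, y, z = p
    return (w, -x, -y, -z)

def omul(o1, o2):
    """Octonion product of 8-tuples, treated as quaternion pairs (a, b):
    (a, b)(c, d) = (ac - d*b, da + bc*), where * is quaternion conjugation."""
    a, b = o1[:4], o1[4:]
    c, d = o2[:4], o2[4:]
    left = tuple(s - t for s, t in zip(qmul(a, c), qmul(qconj(d), b)))
    right = tuple(s + t for s, t in zip(qmul(d, a), qmul(b, qconj(c))))
    return left + right

# Octonion basis elements e1, e2, e4 as 8-tuples.
e1 = (0, 1, 0, 0, 0, 0, 0, 0)
e2 = (0, 0, 1, 0, 0, 0, 0, 0)
e4 = (0, 0, 0, 0, 1, 0, 0, 0)

lhs = omul(omul(e1, e2), e4)  # (e1 * e2) * e4 -> e7
rhs = omul(e1, omul(e2, e4))  # e1 * (e2 * e4) -> -e7
print(lhs)  # (0, 0, 0, 0, 0, 0, 0, 1)
print(rhs)  # (0, 0, 0, 0, 0, 0, 0, -1)
assert lhs != rhs  # octonion multiplication is not associative
```

Running the sketch yields e7 for one grouping and -e7 for the other, which is precisely why octonion-valued layers must fix an evaluation order for chained products rather than assuming associativity.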