Discrete fractional neural networks within the framework of octonions: A preliminary exploration
Jie Ran, Yonghui Zhou, Thabet Abdeljawad, Hao Pu
Journal of Computational Science, vol. 87, Article 102586 (published 2025-04-05). DOI: 10.1016/j.jocs.2025.102586
Citations: 0
Abstract
Conventional neural networks constructed on real or complex domains have limitations in capturing multi-dimensional data with memory effects. This work is a preliminary exploration of discrete fractional neural network modeling within the framework of octonions. First, by introducing the discrete fractional Caputo difference operator into the octonion domain, we establish a novel system of discrete fractional delayed octonion-valued neural networks (DFDOVNNs). The new system provides theoretical support for developing neural network algorithms suited to complex, multi-dimensional real-world problems with memory effects. We then use the Cayley–Dickson technique to decompose the system into four discrete fractional complex-valued neural networks, which sidesteps the non-commutative and non-associative properties of the hypercomplex domain. Next, we establish the existence and uniqueness of the equilibrium point of the system based on homeomorphism theory. Furthermore, by employing Lyapunov theory, we derive straightforward, verifiable linear matrix inequality (LMI) criteria that ensure global Mittag-Leffler stability of the system. In addition, an effective feedback controller is developed to achieve drive-response synchronization of the system in the Mittag-Leffler sense. Finally, two numerical examples support the theoretical analysis. This research introduces a novel direction in neural network studies that promises to advance signal processing, control systems, and artificial intelligence.
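The Cayley–Dickson technique mentioned in the abstract can be illustrated with a minimal NumPy sketch (not code from the paper): an octonion is represented as an ordered pair of quaternions, and the pair product reproduces octonion multiplication. The sketch also checks the two algebraic facts the abstract relies on, that octonions form a composition algebra (so norms multiply) while their multiplication is non-associative. All function names here are illustrative choices, not the authors' implementation.

```python
import numpy as np

def q_mul(p, q):
    """Hamilton product of two quaternions (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def q_conj(q):
    """Quaternion conjugate: negate the vector part."""
    return np.array([q[0], -q[1], -q[2], -q[3]])

def o_mul(o1, o2):
    """Octonion product via the Cayley-Dickson construction.
    An octonion is a pair (a, b) of quaternions, and
    (a, b)(c, d) = (ac - d*b, da + bc*), with * the quaternion conjugate."""
    a, b = o1
    c, d = o2
    return (q_mul(a, c) - q_mul(q_conj(d), b),
            q_mul(d, a) + q_mul(b, q_conj(c)))

def o_norm(o):
    """Euclidean norm of the 8 real components."""
    return np.sqrt(np.sum(o[0]**2) + np.sum(o[1]**2))

rng = np.random.default_rng(0)
x = (rng.standard_normal(4), rng.standard_normal(4))
y = (rng.standard_normal(4), rng.standard_normal(4))
z = (rng.standard_normal(4), rng.standard_normal(4))

# Composition-algebra property: |xy| = |x||y| holds exactly for octonions.
assert np.isclose(o_norm(o_mul(x, y)), o_norm(x) * o_norm(y))

# The associator (xy)z - x(yz) is generically nonzero: non-associativity.
lhs = o_mul(o_mul(x, y), z)
rhs = o_mul(x, o_mul(y, z))
```

The same pair trick is what lets the paper split one octonion-valued system into four complex-valued ones: a quaternion is itself a Cayley–Dickson pair of complex numbers, so the decomposition can be applied twice.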
About the journal:
Computational Science is a rapidly growing multi- and interdisciplinary field that uses advanced computing and data analysis to understand and solve complex problems. It has reached a level of predictive capability that now firmly complements the traditional pillars of experimentation and theory.
Recent advances in experimental techniques, such as detectors, online sensor networks, and high-resolution imaging, have opened new windows into physical and biological processes at many levels of detail. The resulting data explosion allows for detailed data-driven modeling and simulation.
This new discipline in science combines computational thinking, modern computational methods, devices and collateral technologies to address problems far beyond the scope of traditional numerical methods.
Computational science typically unifies three distinct elements:
• Modeling, Algorithms and Simulations (e.g. numerical and non-numerical, discrete and continuous);
• Software developed to solve problems in science (e.g., biological, physical, and social), engineering, medicine, and the humanities;
• Computer and information science that develops and optimizes the advanced system hardware, software, networking, and data management components (e.g. problem solving environments).