Understanding Vector-Valued Neural Networks and Their Relationship With Real and Hypercomplex-Valued Neural Networks: Incorporating intercorrelation between features into neural networks [Hypercomplex Signal and Image Processing]

Impact Factor: 9.4 · JCR Q1, Engineering, Electrical & Electronic (Zone 1, Engineering & Technology)
Marcos Eduardo Valle
DOI: 10.1109/MSP.2024.3401621
Journal: IEEE Signal Processing Magazine, vol. 41, no. 3, pp. 49–58
Publication date: 2024-08-20 (Journal Article)
URL: https://ieeexplore.ieee.org/document/10640345/
Citations: 0

Abstract

Despite the many successful applications of deep learning models for multidimensional signal and image processing, most traditional neural networks process data represented by (multidimensional) arrays of real numbers. The intercorrelation between feature channels is usually expected to be learned from the training data, requiring numerous parameters and careful training. In contrast, vector-valued neural networks (referred to as V-nets ) are conceived to process arrays of vectors and naturally consider the intercorrelation between feature channels. Consequently, they usually have fewer parameters and often undergo more robust training than traditional neural networks. This article aims to present a broad framework for V-nets. In this context, hypercomplex-valued neural networks are regarded as vector-valued models with additional algebraic properties. Furthermore, this article explains the relationship between vector-valued and traditional neural networks. To be precise, a V-net can be obtained by placing restrictions on a real-valued model to consider the intercorrelation between feature channels. Finally, I show how V-nets, including hypercomplex-valued neural networks, can be implemented in current deep learning libraries as real-valued networks.
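The abstract's central claim — that a V-net is a real-valued network with restrictions on its weights, and can therefore be implemented in standard deep learning libraries — can be illustrated with the simplest hypercomplex case: a complex-valued linear layer. The sketch below (my own illustration, not code from the article; the dimension `n` and weights are arbitrary) shows that a complex layer with weights W = A + iB acting on z = x + iy is exactly a real-valued layer whose weight matrix is constrained to the block form [[A, -B], [B, A]], and that this restriction halves the parameter count relative to an unrestricted real layer of the same shape.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 3  # feature dimension (hypothetical size for illustration)

# Complex-valued weights W = A + iB and input z = x + iy.
A, B = rng.standard_normal((n, n)), rng.standard_normal((n, n))
x, y = rng.standard_normal(n), rng.standard_normal(n)

# Reference: genuine complex matrix-vector product W @ z.
w_complex = (A + 1j * B) @ (x + 1j * y)

# Equivalent real-valued layer: stack real/imaginary parts and apply the
# structured block matrix [[A, -B], [B, A]] -- the restriction that turns
# an unrestricted 2n x 2n real layer into a complex-valued one.
W_real = np.block([[A, -B], [B, A]])
out = W_real @ np.concatenate([x, y])

assert np.allclose(out[:n], w_complex.real)
assert np.allclose(out[n:], w_complex.imag)

# Parameter count: the restricted layer has 2*n*n free parameters,
# while an unrestricted real layer of the same shape has (2n)^2.
print(2 * n * n, (2 * n) ** 2)  # 18 36
```

The same construction generalizes to quaternions and other hypercomplex algebras: the algebra's multiplication table fixes the block structure of the real weight matrix, which is why such layers reuse parameters across feature channels.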
Source journal: IEEE Signal Processing Magazine (Engineering & Technology – Electrical & Electronic Engineering)
CiteScore: 27.20
Self-citation rate: 0.70%
Articles published per year: 123
Review time: 6-12 weeks
About the journal: IEEE Signal Processing Magazine is a publication that focuses on signal processing research and applications. It publishes tutorial-style articles, columns, and forums that cover a wide range of topics related to signal processing. The magazine aims to provide the research, educational, and professional communities with the latest technical developments, issues, and events in the field. It serves as the main communication platform for the society, addressing important matters that concern all members.