VFF-Net: Evolving forward–forward algorithms into convolutional neural networks for enhanced computational insights

Impact Factor: 6.0 · CAS Tier 1 (Computer Science) · JCR Q1, Computer Science, Artificial Intelligence
Gilha Lee, Jin Shin, Hyun Kim
{"title":"VFF-Net: Evolving forward–forward algorithms into convolutional neural networks for enhanced computational insights","authors":"Gilha Lee,&nbsp;Jin Shin,&nbsp;Hyun Kim","doi":"10.1016/j.neunet.2025.107697","DOIUrl":null,"url":null,"abstract":"<div><div>In recent years, significant efforts have been made to overcome the limitations inherent in the traditional back-propagation (BP) algorithm. These limitations include overfitting, vanishing/exploding gradients, slow convergence, and black-box nature. To address these limitations, alternatives to BP have been explored, the most well-known of which is the forward–forward network (FFN). We propose a visual forward–forward network (VFF-Net) that significantly improves FFNs for deeper networks, focusing on enhancing performance in convolutional neural network (CNN) training. VFF-Net utilizes a label-wise noise labeling method and cosine-similarity-based contrastive loss, which directly uses intermediate features to solve both the input information loss problem and the performance drop problem caused by the goodness function when applied to CNNs. Furthermore, VFF-Net is accompanied by layer grouping, which groups layers with the same output channel for application in well-known existing CNN-based models; this reduces the number of minima that need to be optimized and facilitates the transfer to CNN-based models by demonstrating the effects of ensemble training. VFF-Net improves the test error by up to 8.31% and 3.80% on a model consisting of four convolutional layers compared with the FFN model targeting a conventional CNN on CIFAR-10 and CIFAR-100, respectively. Furthermore, the fully connected layer-based VFF-Net achieved a test error of 1.70% on the MNIST dataset, which is better than that of the existing BP. In conclusion, the proposed VFF-Net significantly reduces the performance gap with BP by improving the FFN and shows the flexibility to be portable to existing CNN-based models.</div></div>","PeriodicalId":49763,"journal":{"name":"Neural Networks","volume":"190 ","pages":"Article 107697"},"PeriodicalIF":6.0000,"publicationDate":"2025-06-16","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Neural Networks","FirstCategoryId":"94","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S0893608025005775","RegionNum":1,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
引用次数: 0

Abstract

In recent years, significant efforts have been made to overcome the limitations inherent in the traditional back-propagation (BP) algorithm, including overfitting, vanishing/exploding gradients, slow convergence, and a black-box nature. To address these limitations, alternatives to BP have been explored, the best known of which is the forward–forward network (FFN). We propose a visual forward–forward network (VFF-Net) that substantially improves FFNs for deeper networks, focusing on enhancing performance in convolutional neural network (CNN) training. VFF-Net employs a label-wise noise labeling method and a cosine-similarity-based contrastive loss that operates directly on intermediate features, addressing both the loss of input information and the performance drop caused by the goodness function when applied to CNNs. Furthermore, VFF-Net introduces layer grouping, which groups layers with the same number of output channels for application to well-known existing CNN-based models; this reduces the number of minima that need to be optimized and, by providing an ensemble-like training effect, eases transfer to CNN-based models. Compared with an FFN applied to a conventional CNN, VFF-Net improves the test error by up to 8.31% and 3.80% on a model consisting of four convolutional layers on CIFAR-10 and CIFAR-100, respectively. Moreover, the fully connected layer-based VFF-Net achieves a test error of 1.70% on the MNIST dataset, outperforming existing BP-based training. In conclusion, the proposed VFF-Net significantly narrows the performance gap with BP by improving the FFN, and it is flexible enough to be ported to existing CNN-based models.
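The abstract names two per-layer objectives: the classic forward-forward "goodness" loss and a cosine-similarity-based contrastive loss on intermediate features. The sketch below (PyTorch) illustrates plausible forms of both; it is a minimal sketch under stated assumptions, not the authors' implementation. The function names, the threshold `theta`, and the temperature `tau` are illustrative choices introduced here, and the contrastive term is a generic supervised cosine-similarity loss standing in for the paper's own formulation.

```python
import torch
import torch.nn.functional as F

def goodness_loss(pos_act, neg_act, theta=2.0):
    # Classic forward-forward objective (Hinton, 2022): "goodness" is the
    # mean squared activation of a layer. Positive (correctly labeled)
    # samples are pushed above the threshold theta, negatives below it.
    g_pos = pos_act.flatten(1).pow(2).mean(dim=1)
    g_neg = neg_act.flatten(1).pow(2).mean(dim=1)
    return (F.softplus(theta - g_pos) + F.softplus(g_neg - theta)).mean()

def cosine_contrastive_loss(feats, labels, tau=0.1):
    # Supervised contrastive loss on intermediate features using cosine
    # similarity: same-class features are pulled together, others pushed
    # apart. A hypothetical stand-in for VFF-Net's loss, which the paper
    # defines in its own form.
    z = F.normalize(feats.flatten(1), dim=1)             # unit-norm features
    sim = (z @ z.t()) / tau                              # pairwise cosine sims
    n = z.size(0)
    eye = torch.eye(n, dtype=torch.bool, device=z.device)
    pos = labels.unsqueeze(0).eq(labels.unsqueeze(1)) & ~eye  # same-class pairs
    sim = sim.masked_fill(eye, float('-inf'))            # exclude self-pairs
    log_prob = sim - sim.logsumexp(dim=1, keepdim=True)
    # Mean log-probability of same-class pairs per anchor; clamp avoids
    # division by zero for anchors with no positive in the batch.
    pos_counts = pos.sum(dim=1).clamp(min=1)
    return -(log_prob.masked_fill(~pos, 0.0).sum(dim=1) / pos_counts).mean()

# Toy per-layer local training step: the loss is computed from this
# layer's own output, so no end-to-end back-propagation is required.
x = torch.randn(8, 3, 32, 32)                 # toy batch
y = torch.randint(0, 10, (8,))
layer = torch.nn.Conv2d(3, 16, 3, padding=1)
opt = torch.optim.SGD(layer.parameters(), lr=0.01)

h = F.relu(layer(x.detach()))                 # detach: local objective only
loss = cosine_contrastive_loss(h, y)
loss.backward()
opt.step()
```

Because each layer optimizes its own local loss, gradients never propagate across layers; this is the property that lets FF-style methods sidestep end-to-end back-propagation, and it is what the cosine-similarity loss above preserves while avoiding the goodness function's drawbacks on CNN feature maps.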
Source Journal
Neural Networks
Category: Engineering & Technology – Computer Science: Artificial Intelligence
CiteScore: 13.90
Self-citation rate: 7.70%
Articles published: 425
Review time: 67 days
Journal description: Neural Networks is a platform that aims to foster an international community of scholars and practitioners interested in neural networks, deep learning, and other approaches to artificial intelligence and machine learning. Our journal invites submissions covering various aspects of neural networks research, from computational neuroscience and cognitive modeling to mathematical analyses and engineering applications. By providing a forum for interdisciplinary discussions between biology and technology, we aim to encourage the development of biologically-inspired artificial intelligence.