Distance-Forward Learning: Enhancing the Forward-Forward Algorithm Towards High-Performance On-Chip Learning

Yujie Wu, Siyuan Xu, Jibin Wu, Lei Deng, Mingkun Xu, Qinghao Wen, Guoqi Li

arXiv:2408.14925 [cs.NE], published 2024-08-27
Abstract
The Forward-Forward (FF) algorithm was recently proposed as a local learning
method to address the limitations of backpropagation (BP), offering biological
plausibility along with memory-efficient and highly parallelized computational
benefits. However, it suffers from suboptimal performance and poor
generalization, largely due to inadequate theoretical support and a lack of
effective learning strategies. In this work, we reformulate FF using distance
metric learning and propose a distance-forward algorithm (DF) to improve FF
performance in supervised vision tasks while preserving its local computational
properties, making it competitive for efficient on-chip learning. To achieve
this, we reinterpret FF through the lens of centroid-based metric learning and
develop a goodness-based N-pair margin loss to facilitate the learning of
discriminative features. Furthermore, we integrate layer-collaboration local
update strategies to reduce information loss caused by greedy local parameter
updates. Our method surpasses existing FF models and other advanced local
learning approaches, achieving accuracies of 99.7\% on MNIST, 88.2\% on CIFAR-10,
59\% on CIFAR-100, 95.9\% on SVHN, and 82.5\% on ImageNette.
Moreover, it achieves performance comparable to BP training at less than 40\% of
the memory cost, while exhibiting stronger robustness to multiple types
of hardware-related noise, demonstrating its potential for online learning and
energy-efficient computation on neuromorphic chips.
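The abstract does not give the exact form of the goodness-based N-pair margin loss, so the sketch below is only one plausible reading: a centroid-based hinge objective in which each sample's layer activation is pulled toward its class centroid and pushed at least a margin farther from all other centroids. The names `goodness` and `df_margin_loss`, and the specific hinge formulation, are our own assumptions, not the paper's.

```python
import numpy as np

def goodness(h):
    # FF-style "goodness": mean squared activation of a layer output.
    return np.mean(h ** 2, axis=-1)

def df_margin_loss(h, labels, centroids, margin=1.0):
    # Hypothetical distance-forward margin loss (N-pair-style hinge):
    # the distance to the correct class centroid should be smaller, by
    # at least `margin`, than the distance to every other centroid.
    #   h:         (B, D) layer activations
    #   labels:    (B,)   integer class labels
    #   centroids: (C, D) per-class centroids
    labels = np.asarray(labels)
    d = np.linalg.norm(h[:, None, :] - centroids[None, :, :], axis=-1)  # (B, C)
    pos = d[np.arange(len(labels)), labels]            # distance to own class
    mask = np.ones_like(d, dtype=bool)
    mask[np.arange(len(labels)), labels] = False
    neg = d[mask].reshape(len(labels), -1)             # distances to other classes
    hinge = np.maximum(0.0, margin + pos[:, None] - neg)
    return hinge.mean()
```

Because the loss depends only on the layer's own activations and a set of centroids, it can be evaluated and differentiated locally per layer, which is the property that makes the method attractive for on-chip learning.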