Foxtsage vs. Adam: Revolution or evolution in optimization?

Impact Factor: 2.1 · CAS Zone 3 (Psychology) · JCR Q3 (Computer Science, Artificial Intelligence)
Sirwan Abdolwahed Aula, Tarik Ahmed Rashid
{"title":"Foxtsage vs. Adam:优化的革命还是进化?","authors":"Sirwan Abdolwahed Aula ,&nbsp;Tarik Ahmed Rashid","doi":"10.1016/j.cogsys.2025.101373","DOIUrl":null,"url":null,"abstract":"<div><div>Optimisation techniques are crucial in neural network training, influencing predictive performance, convergence efficiency, and computational feasibility. Traditional Optimisers such as Adam offer adaptive learning rates but struggle with convergence stability and hyperparameter sensitivity, whereas SGD provides stability but lacks adaptiveness. We propose Foxtsage, a novel hybrid optimisation algorithm that integrates the FOX-TSA (for global search and exploration) with SGD (for fine-tuned local exploitation) to address these limitations. The proposed Foxtsage Optimiser is benchmarked against the widely used Adam Optimiser across multiple standard datasets, including MNIST, IMDB, and CIFAR-10. Performance is evaluated based on training loss, accuracy, precision, recall, F1-score, and computational time. The study further explores computational complexity and the trade-off between optimisation performance and efficiency. Experimental findings demonstrate that Foxtsage achieves a 42.03% reduction in mean loss (Foxtsage: 9.508, Adam: 16.402) and a 42.19% decrease in loss standard deviation (Foxtsage: 20.86, Adam: 36.085), indicating greater consistency and robustness in optimisation. Additionally, modest improvements are observed in accuracy (0.78%), precision (0.91%), recall (1.02%), and F1-score (0.89%), showcasing better generalization capability. However, these gains come at a significant computational cost, with a 330.87% increase in time mean (Foxtsage: 39.541 sec, Adam: 9.177 sec), raising concerns about practical feasibility in time-sensitive applications. By effectively combining FOX-TSA’s global search power with SGD’s adaptive stability, Foxtsage provides a promising alternative for neural network training. While it enhances performance and robustness, its increased computational overhead presents a critical trade-off. Future work will focus on reducing computational complexity, improving scalability, and exploring its applicability in real-world deep-learning tasks.</div></div>","PeriodicalId":55242,"journal":{"name":"Cognitive Systems Research","volume":"92 ","pages":"Article 101373"},"PeriodicalIF":2.1000,"publicationDate":"2025-05-31","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Foxtsage vs. Adam: Revolution or evolution in optimization?\",\"authors\":\"Sirwan Abdolwahed Aula ,&nbsp;Tarik Ahmed Rashid\",\"doi\":\"10.1016/j.cogsys.2025.101373\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"<div><div>Optimisation techniques are crucial in neural network training, influencing predictive performance, convergence efficiency, and computational feasibility. Traditional Optimisers such as Adam offer adaptive learning rates but struggle with convergence stability and hyperparameter sensitivity, whereas SGD provides stability but lacks adaptiveness. We propose Foxtsage, a novel hybrid optimisation algorithm that integrates the FOX-TSA (for global search and exploration) with SGD (for fine-tuned local exploitation) to address these limitations. The proposed Foxtsage Optimiser is benchmarked against the widely used Adam Optimiser across multiple standard datasets, including MNIST, IMDB, and CIFAR-10. Performance is evaluated based on training loss, accuracy, precision, recall, F1-score, and computational time. 
The study further explores computational complexity and the trade-off between optimisation performance and efficiency. Experimental findings demonstrate that Foxtsage achieves a 42.03% reduction in mean loss (Foxtsage: 9.508, Adam: 16.402) and a 42.19% decrease in loss standard deviation (Foxtsage: 20.86, Adam: 36.085), indicating greater consistency and robustness in optimisation. Additionally, modest improvements are observed in accuracy (0.78%), precision (0.91%), recall (1.02%), and F1-score (0.89%), showcasing better generalization capability. However, these gains come at a significant computational cost, with a 330.87% increase in time mean (Foxtsage: 39.541 sec, Adam: 9.177 sec), raising concerns about practical feasibility in time-sensitive applications. By effectively combining FOX-TSA’s global search power with SGD’s adaptive stability, Foxtsage provides a promising alternative for neural network training. While it enhances performance and robustness, its increased computational overhead presents a critical trade-off. Future work will focus on reducing computational complexity, improving scalability, and exploring its applicability in real-world deep-learning tasks.</div></div>\",\"PeriodicalId\":55242,\"journal\":{\"name\":\"Cognitive Systems Research\",\"volume\":\"92 \",\"pages\":\"Article 101373\"},\"PeriodicalIF\":2.1000,\"publicationDate\":\"2025-05-31\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Cognitive Systems Research\",\"FirstCategoryId\":\"102\",\"ListUrlMain\":\"https://www.sciencedirect.com/science/article/pii/S1389041725000531\",\"RegionNum\":3,\"RegionCategory\":\"心理学\",\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"Q3\",\"JCRName\":\"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Cognitive Systems Research","FirstCategoryId":"102","ListUrlMain":"https://www.sciencedirect.com/science/article/pii/S1389041725000531","RegionNum":3,"RegionCategory":"心理学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q3","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Optimisation techniques are crucial in neural network training, influencing predictive performance, convergence efficiency, and computational feasibility. Traditional optimisers such as Adam offer adaptive learning rates but struggle with convergence stability and hyperparameter sensitivity, whereas SGD provides stability but lacks adaptiveness. We propose Foxtsage, a novel hybrid optimisation algorithm that integrates FOX-TSA (for global search and exploration) with SGD (for fine-tuned local exploitation) to address these limitations. The proposed Foxtsage optimiser is benchmarked against the widely used Adam optimiser across multiple standard datasets, including MNIST, IMDB, and CIFAR-10. Performance is evaluated on training loss, accuracy, precision, recall, F1-score, and computational time. The study further explores computational complexity and the trade-off between optimisation performance and efficiency. Experimental findings demonstrate that Foxtsage achieves a 42.03% reduction in mean loss (Foxtsage: 9.508, Adam: 16.402) and a 42.19% decrease in loss standard deviation (Foxtsage: 20.86, Adam: 36.085), indicating greater consistency and robustness in optimisation. Additionally, modest improvements are observed in accuracy (0.78%), precision (0.91%), recall (1.02%), and F1-score (0.89%), indicating better generalisation capability. However, these gains come at a significant computational cost, with a 330.87% increase in mean training time (Foxtsage: 39.541 s, Adam: 9.177 s), raising concerns about practical feasibility in time-sensitive applications. By combining FOX-TSA's global search power with SGD's stability, Foxtsage provides a promising alternative for neural network training. While it enhances performance and robustness, its increased computational overhead presents a critical trade-off. Future work will focus on reducing computational complexity, improving scalability, and exploring applicability to real-world deep-learning tasks.
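The abstract does not give the concrete update rules of FOX-TSA or of the hybrid schedule, so the following is only a minimal sketch of the general idea it describes: a population-based global exploration phase followed by SGD-style local exploitation, written in plain NumPy on a toy logistic-regression problem. The population update rule, the hyperparameters (population_size, explore_steps, lr, batch_size), and the toy data are all illustrative assumptions, not the authors' algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data (a stand-in for the MNIST/IMDB/CIFAR-10 tasks in the paper).
X = rng.normal(size=(200, 20))
true_w = rng.normal(size=20)
y = (X @ true_w + 0.1 * rng.normal(size=200) > 0).astype(float)

def loss(w):
    """Mean logistic loss of weight vector w on the toy dataset."""
    p = 1.0 / (1.0 + np.exp(-(X @ w)))
    eps = 1e-9
    return -np.mean(y * np.log(p + eps) + (1.0 - y) * np.log(1.0 - p + eps))

# --- Phase 1: population-based global exploration (the role FOX-TSA plays in Foxtsage) ---
population_size, explore_steps = 20, 30          # illustrative assumptions
population = rng.normal(size=(population_size, X.shape[1]))
for _ in range(explore_steps):
    fitness = np.array([loss(w) for w in population])
    best = population[fitness.argmin()]
    # Pull every candidate partway toward the current best and add exploration noise.
    population = population + 0.5 * (best - population) \
        + rng.normal(scale=0.1, size=population.shape)

# --- Phase 2: mini-batch SGD local exploitation, started from the best candidate found ---
w = min(population, key=loss).copy()
lr, batch_size = 0.5, 32                         # illustrative assumptions
for epoch in range(50):
    order = rng.permutation(len(y))
    for start in range(0, len(y), batch_size):
        idx = order[start:start + batch_size]
        p = 1.0 / (1.0 + np.exp(-(X[idx] @ w)))
        w -= lr * X[idx].T @ (p - y[idx]) / len(idx)   # logistic-loss gradient step

print(f"final training loss after the hybrid run: {loss(w):.4f}")
```

In this two-phase arrangement the population step plays the exploration role the paper assigns to FOX-TSA, while the mini-batch gradient loop plays the SGD fine-tuning role; the repeated population evaluations are also the natural source of the kind of extra wall-clock cost the abstract reports.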
Source journal: Cognitive Systems Research (Engineering & Technology - Computer Science: Artificial Intelligence)
CiteScore: 9.40
Self-citation rate: 5.10%
Articles per year: 40
Review time: >12 weeks
Journal description: Cognitive Systems Research is dedicated to the study of human-level cognition. As such, it welcomes papers which advance the understanding, design and applications of cognitive and intelligent systems, both natural and artificial. The journal brings together a broad community studying cognition in its many facets in vivo and in silico, across the developmental spectrum, focusing on individual capacities or on entire architectures. It aims to foster debate and integrate ideas, concepts, constructs, theories, models and techniques from across different disciplines and different perspectives on human-level cognition. The scope of interest includes the study of cognitive capacities and architectures - both brain-inspired and non-brain-inspired - and the application of cognitive systems to real-world problems as far as it offers insights relevant for the understanding of cognition. Cognitive Systems Research therefore welcomes mature and cutting-edge research approaching cognition from a systems-oriented perspective, both theoretical and empirically-informed, in the form of original manuscripts, short communications, opinion articles, systematic reviews, and topical survey articles from the fields of Cognitive Science (including Philosophy of Cognitive Science), Artificial Intelligence/Computer Science, Cognitive Robotics, Developmental Science, Psychology, and Neuroscience and Neuromorphic Engineering. Empirical studies will be considered if they are supplemented by theoretical analyses and contributions to theory development and/or computational modelling studies.