A step gravitational search algorithm for function optimization and STTM’s synchronous feature selection-parameter optimization

IF 13.9 · CAS Tier 2 (Computer Science) · Q1 COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE
Chaodong Fan, Laurence T. Yang, Leyi Xiao
{"title":"A step gravitational search algorithm for function optimization and STTM’s synchronous feature selection-parameter optimization","authors":"Chaodong Fan,&nbsp;Laurence T. Yang,&nbsp;Leyi Xiao","doi":"10.1007/s10462-025-11193-y","DOIUrl":null,"url":null,"abstract":"<div><p>The support tensor train machine (STTM) can make full use of the correlation of tensor data structures, while the parameter training is inefficient and feature redundancy is large. For this, a step gravitational search algorithm (SGSA) is proposed and used for synchronous feature selection and parameter optimization of STTM in this paper. Since the single population structure of the gravitational search algorithm is difficult to balance exploration and exploitation effectively, a new dual population structure is defined by the step function. Subpopulation Pop1 focuses on exploration, and a <i>K</i><sub><i>best</i></sub><i>-Elite</i> hybrid learning strategy is designed to avoid the rapid decline of exploration ability due to the rapid reduction of the size of <i>K</i><sub><i>best</i></sub> set as well as the gravitational constant <i>G</i>. Subpopulation Pop2 focuses on exploitation, and a position update strategy that integrates Cauchy distribution and Gaussian distribution is designed to make Pop2 always have a certain exploration ability. Finally, use SGSA to solve the synchronous feature selection and parameter optimization problem of STTM (the resulting model is denoted as SGSA-STTM). The algorithm’s optimization performance test results show that SGSA can obtain relatively best results on most test functions compared with other state-of-the-art algorithms. The classification performance test on fMRI datasets shows that SGSA-STTM can remove more than 40% of redundant features on most datasets, which can effectively improve the efficiency of the algorithm, and the classification accuracy for the StarPlus fMRI dataset and the CMU Science 2008 fMRI dataset reached 60 and 70%, respectively.</p></div>","PeriodicalId":8449,"journal":{"name":"Artificial Intelligence Review","volume":"58 6","pages":""},"PeriodicalIF":13.9000,"publicationDate":"2025-03-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://link.springer.com/content/pdf/10.1007/s10462-025-11193-y.pdf","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Artificial Intelligence Review","FirstCategoryId":"94","ListUrlMain":"https://link.springer.com/article/10.1007/s10462-025-11193-y","RegionNum":2,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"Q1","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

The support tensor train machine (STTM) can make full use of the correlation in tensor data structures, but its parameter training is inefficient and its feature redundancy is large. To address this, a step gravitational search algorithm (SGSA) is proposed in this paper and applied to the synchronous feature selection and parameter optimization of STTM. Because the single-population structure of the gravitational search algorithm makes it difficult to balance exploration and exploitation effectively, a new dual-population structure is defined by a step function. Subpopulation Pop1 focuses on exploration, and a Kbest-Elite hybrid learning strategy is designed to avoid the rapid decline in exploration ability caused by the rapid shrinkage of the Kbest set and of the gravitational constant G. Subpopulation Pop2 focuses on exploitation, and a position update strategy that integrates the Cauchy and Gaussian distributions is designed so that Pop2 always retains a certain exploration ability. Finally, SGSA is used to solve the synchronous feature selection and parameter optimization problem of STTM (the resulting model is denoted SGSA-STTM). Optimization performance tests show that SGSA obtains the best results on most test functions compared with other state-of-the-art algorithms. Classification tests on fMRI datasets show that SGSA-STTM removes more than 40% of redundant features on most datasets, which effectively improves the efficiency of the algorithm, and the classification accuracy on the StarPlus fMRI dataset and the CMU Science 2008 fMRI dataset reaches 60% and 70%, respectively.
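For readers unfamiliar with the gravitational search family, the sketch below illustrates, in Python, the kind of update the abstract describes: a standard GSA position update combined with a step-function split into an exploration subpopulation (Pop1) and an exploitation subpopulation (Pop2) that is perturbed with Cauchy/Gaussian noise. It is a minimal sketch based only on the abstract; the constants, the split rule, and the perturbation scales are illustrative assumptions, not the paper's actual SGSA or its Kbest-Elite strategy.

```python
import numpy as np

def sphere(x):
    """Toy objective function (minimization)."""
    return float(np.sum(x ** 2))

def sgsa_like(obj, dim=10, pop=30, iters=200, lb=-5.0, ub=5.0, seed=0):
    """Gravitational-search-style optimizer with a two-subpopulation split."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(lb, ub, (pop, dim))   # positions
    V = np.zeros((pop, dim))              # velocities
    G0, alpha, eps = 100.0, 20.0, 1e-12   # illustrative GSA constants

    for t in range(iters):
        fit = np.array([obj(x) for x in X])
        best, worst = fit.min(), fit.max()
        m = (worst - fit) / (worst - best + eps)   # better fitness -> larger mass
        M = m / (m.sum() + eps)
        G = G0 * np.exp(-alpha * t / iters)        # decaying gravitational constant

        order = np.argsort(fit)                    # indices sorted best-first
        # Step-function split (an assumption): the better half acts as the
        # exploitation subpopulation (Pop2); the rest (Pop1) keeps the
        # gravitational update and explores.
        pop2 = set(order[: pop // 2])
        kbest = order[: max(2, int(pop * (1 - t / iters)))]  # shrinking Kbest set

        A = np.zeros_like(X)
        for i in range(pop):
            for j in kbest:
                if j == i:
                    continue
                R = np.linalg.norm(X[i] - X[j])
                A[i] += rng.random(dim) * G * M[j] * (X[j] - X[i]) / (R + eps)

        V = rng.random((pop, dim)) * V + A
        X_new = X + V

        gbest = X[order[0]].copy()
        for i in pop2:
            # Exploitation step: jitter around the current best using either
            # Cauchy- or Gaussian-distributed noise (a stand-in for the
            # paper's hybrid Cauchy/Gaussian position update).
            noise = rng.standard_cauchy(dim) if rng.random() < 0.5 else rng.standard_normal(dim)
            X_new[i] = gbest + 0.01 * (ub - lb) * noise

        X = np.clip(X_new, lb, ub)

    fit = np.array([obj(x) for x in X])
    return X[fit.argmin()], float(fit.min())

if __name__ == "__main__":
    x_best, f_best = sgsa_like(sphere)
    print("best objective value:", f_best)
```

On the sphere function this converges toward the origin. Note that the actual SGSA-STTM additionally encodes a feature-selection mask and the STTM hyperparameters in each individual, which this sketch does not model.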

Source journal
Artificial Intelligence Review (Engineering & Technology, Computer Science: Artificial Intelligence)
CiteScore: 22.00
Self-citation rate: 3.30%
Annual publications: 194
Review time: 5.3 months
Journal description: Artificial Intelligence Review, a fully open access journal, publishes cutting-edge research in artificial intelligence and cognitive science. It features critical evaluations of applications, techniques, and algorithms, providing a platform for both researchers and application developers. The journal includes refereed survey and tutorial articles, along with reviews and commentary on significant developments in the field.