Activation function cyclically switchable convolutional neural network model.

IF 3.5 | CAS Zone 4 (Computer Science) | JCR Q2 (COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE)
PeerJ Computer Science | Pub Date: 2025-03-24 | eCollection Date: 2025-01-01 | DOI: 10.7717/peerj-cs.2756
İsmail Akgül
{"title":"Activation function cyclically switchable convolutional neural network model.","authors":"İsmail Akgül","doi":"10.7717/peerj-cs.2756","DOIUrl":null,"url":null,"abstract":"<p><p>Neural networks are a state-of-the-art approach that performs well for many tasks. The activation function (AF) is an important hyperparameter that creates an output against the coming inputs to the neural network model. AF significantly affects the training and performance of the neural network model. Therefore, selecting the most optimal AF for processing input data in neural networks is important. Determining the optimal AF is often a difficult task. To overcome this difficulty, studies on trainable AFs have been carried out in the literature in recent years. This study presents a different approach apart from fixed or trainable AF approaches. For this purpose, the activation function cyclically switchable convolutional neural network (AFCS-CNN) model structure is proposed. The AFCS-CNN model structure does not use a fixed AF value during training. It is designed in a self-regulating model structure by switching the AF during model training. The proposed model structure is based on the logic of starting training with the most optimal AF selection among many AFs and cyclically selecting the next most optimal AF depending on the performance decrease during neural network training. Any convolutional neural network (CNN) model can be easily used in the proposed model structure. In this way, a simple but effective perspective has been presented. In this study, first, ablation studies have been carried out using the Cifar-10 dataset to determine the CNN models to be used in the AFCS-CNN model structure and the specific hyperparameters of the proposed model structure. After the models and hyperparameters were determined, expansion experiments were carried out using different datasets with the proposed model structure. The results showed that the AFCS-CNN model structure achieved state-of-the-art success in many CNN models and different datasets.</p>","PeriodicalId":54224,"journal":{"name":"PeerJ Computer Science","volume":"11 ","pages":"e2756"},"PeriodicalIF":3.5000,"publicationDate":"2025-03-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC11948315/pdf/","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"PeerJ Computer Science","FirstCategoryId":"94","ListUrlMain":"https://doi.org/10.7717/peerj-cs.2756","RegionNum":4,"RegionCategory":"计算机科学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"2025/1/1 0:00:00","PubModel":"eCollection","JCR":"Q2","JCRName":"COMPUTER SCIENCE, ARTIFICIAL INTELLIGENCE","Score":null,"Total":0}
Citations: 0

Abstract

Neural networks are a state-of-the-art approach that performs well for many tasks. The activation function (AF) is an important hyperparameter that produces an output from the inputs arriving at the neural network model, and it significantly affects the model's training and performance. Therefore, selecting the optimal AF for processing input data in neural networks is important, yet determining it is often difficult. To overcome this difficulty, studies on trainable AFs have been carried out in the literature in recent years. This study presents an approach distinct from both fixed and trainable AF approaches. For this purpose, the activation function cyclically switchable convolutional neural network (AFCS-CNN) model structure is proposed. The AFCS-CNN model structure does not use a fixed AF during training; instead, it is designed as a self-regulating structure that switches the AF during model training. The proposed structure starts training with the best-performing AF among many candidate AFs and cyclically selects the next best AF whenever performance decreases during training. Any convolutional neural network (CNN) model can easily be used within the proposed structure, offering a simple but effective perspective. In this study, ablation studies were first carried out on the CIFAR-10 dataset to determine the CNN models to be used in the AFCS-CNN structure and the specific hyperparameters of the proposed structure. After the models and hyperparameters were determined, expansion experiments were carried out on different datasets with the proposed structure. The results showed that the AFCS-CNN model structure achieved state-of-the-art success across many CNN models and different datasets.
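The switching idea described in the abstract can be illustrated with a short sketch. The PyTorch code below is a minimal, hypothetical illustration, not the paper's implementation: the candidate AF list and its ordering, the small CNN, and the "switch after validation accuracy stops improving for a few epochs" criterion are assumptions made for this example; the paper determines its models, AF ranking, and switching hyperparameters through ablation studies.

```python
import torch
import torch.nn as nn

# Assumed candidate AFs and ordering (illustrative only).
CANDIDATE_AFS = [nn.ReLU, nn.SiLU, nn.Mish, nn.GELU]

class SmallCNN(nn.Module):
    """Tiny example CNN for 32x32 RGB inputs (e.g., CIFAR-10)."""
    def __init__(self, af_cls, num_classes=10):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, 3, padding=1), af_cls(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), af_cls(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(64 * 8 * 8, num_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

def swap_activations(module, af_cls):
    """Replace every activation submodule in-place with an instance of af_cls."""
    for name, child in module.named_children():
        if isinstance(child, tuple(CANDIDATE_AFS)):
            setattr(module, name, af_cls())
        else:
            swap_activations(child, af_cls)

@torch.no_grad()
def evaluate(model, loader):
    """Plain top-1 accuracy on a validation loader."""
    model.eval()
    correct = total = 0
    for x, y in loader:
        correct += (model(x).argmax(1) == y).sum().item()
        total += y.numel()
    return correct / total

def train_afcs(model, train_loader, val_loader, epochs=30, patience=2):
    """Train while cycling to the next candidate AF whenever validation
    accuracy fails to improve for `patience` epochs (assumed criterion)."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()
    af_idx, best_acc, bad_epochs = 0, 0.0, 0
    for epoch in range(epochs):
        model.train()
        for x, y in train_loader:
            opt.zero_grad()
            loss_fn(model(x), y).backward()
            opt.step()
        acc = evaluate(model, val_loader)
        if acc > best_acc:
            best_acc, bad_epochs = acc, 0
        else:
            bad_epochs += 1
        if bad_epochs >= patience:
            # Cycle to the next candidate AF and keep training the same weights.
            af_idx = (af_idx + 1) % len(CANDIDATE_AFS)
            swap_activations(model, CANDIDATE_AFS[af_idx])
            bad_epochs = 0
    return model
```

The design point mirrored from the abstract is that the model's weights are retained across switches: only the activation modules are replaced, so training continues from the current state with the next candidate AF rather than restarting.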

Source journal: PeerJ Computer Science (General Computer Science)
CiteScore: 6.10
Self-citation rate: 5.30%
Articles published: 332
Review time: 10 weeks
Journal description: PeerJ Computer Science is the new open access journal covering all subject areas in computer science, with the backing of a prestigious advisory board and more than 300 academic editors.