Autoencoder performance analysis with adaptive and trainable activation function to compress images

João Henrique De Medeiros Delgado, T. Ferreira
Published in: 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI)
Publication date: 2022-11-23
DOI: 10.1109/LA-CCI54402.2022.9981644
Citations: 0

Abstract

Activation functions are key to good performance in a neural network. Many functions can be used, and the choice of which one to use depends on the problem at hand. Adaptive and trainable activation functions have recently been studied as a way to increase network performance. This study evaluates the performance of an artificial neuron that uses adaptive and trainable functions in an Autoencoder network for image compression problems. The tested neuron, known as the Global-Local Neuron (GLN), comprises two complementary components, one with global characteristics and the other with local characteristics. The global component is given by a sine function and the local component by the hyperbolic tangent function. The experiment was carried out in two stages. In the first, different activation functions, GLN, Tanh, and Sine, were tested in an MLP-type Autoencoder neural network model. Different compression ratios were considered by varying the size of the Autoencoder bottleneck layer, and 48 samples were obtained for each value of this layer. The metrics used for the evaluation were the loss value obtained on the test set and the number of epochs necessary to reach a stopping criterion. In the second stage, the classification accuracy of the images compressed by the encoder block of the previous model was evaluated, using a Wide Residual Network (WRN) and the Support Vector Machines (SVM) method. The results indicated that the use of the Global-Local Neuron improved the network training speed, achieved better classification accuracy in a WRN network for compression ratios up to 50%, and demonstrated its adaptability to image classification problems.
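The abstract does not give the exact rule by which the GLN combines its two components. A minimal sketch, assuming (as one common formulation of such hybrid neurons) that the global (sine) and local (tanh) parts are blended through a mixing weight `alpha` that would be trainable in practice:

```python
import numpy as np

def gln(x, alpha=0.5):
    """Hypothetical Global-Local Neuron activation.

    Blends a global component (sine) with a local component (tanh);
    `alpha` is the mixing weight, assumed here to be a trainable
    scalar in the actual model. The exact combination used in the
    paper may differ.
    """
    return alpha * np.sin(x) + (1.0 - alpha) * np.tanh(x)

# At alpha = 1 the neuron behaves as a pure sine activation;
# at alpha = 0 it reduces to a plain tanh.
```

Because both components pass through the origin, the blended activation does as well, which keeps it usable as a drop-in replacement for tanh in an MLP autoencoder.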