Authors: João Henrique De Medeiros Delgado, T. Ferreira
DOI: 10.1109/LA-CCI54402.2022.9981644
Published in: 2022 IEEE Latin American Conference on Computational Intelligence (LA-CCI), 2022-11-23
Autoencoder performance analysis with adaptive and trainable activation function to compress images
Activation functions are essential to good performance in a neural network. Many functions are available, and the choice of which one to use depends on the problem being addressed. Adaptive and trainable activation functions have been studied recently as a way to increase network performance. This study evaluates the performance of an artificial neuron that uses an adaptive, trainable activation function in an Autoencoder network for image compression problems. The tested neuron, known as the Global-Local Neuron (GLN), comprises two complementary components, one with global characteristics and the other with local characteristics: the global component is given by a sine function and the local component by the hyperbolic tangent. The experiment was carried out in two stages. In the first, different activation functions (GLN, Tanh, and Sine) were tested in an MLP-type Autoencoder neural network model. Different compression ratios were obtained by varying the size of the Autoencoder's bottleneck layer, and 48 samples were collected for each bottleneck size. The evaluation metrics were the loss on the test set and the number of epochs needed to reach a stopping criterion. In the second stage, the classification accuracy of images compressed by the encoder block of the previous model was evaluated using a Wide Residual Network (WRN) and the Support Vector Machines (SVM) method. The results indicated that the Global-Local Neuron improved network training speed, achieved better classification accuracy with a WRN for compression of up to 50%, and demonstrated its adaptability to image classification problems.
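The abstract describes the Global-Local Neuron as the combination of a global sine component and a local hyperbolic-tangent component. A minimal sketch of such an activation is shown below; the mixing weights `w_global` and `w_local` are an assumed parameterization (in the actual model they would be trainable), since the abstract does not specify how the two components are combined:

```python
import math

def gln(v, w_global=0.5, w_local=0.5):
    """Global-Local Neuron activation (illustrative sketch).

    Mixes a global sine component with a local hyperbolic-tangent
    component. The weights are plain parameters here; in a trained
    network they would be learned alongside the other weights
    (assumed form, not the paper's exact definition).
    """
    return w_global * math.sin(v) + w_local * math.tanh(v)
```

With `w_local = 0` the neuron reduces to a scaled sine, and with `w_global = 0` to a scaled tanh, which correspond to the two baseline activations (Sine and Tanh) compared against GLN in the first stage of the experiment.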