{"title":"RNN框架下SSAG的有效性研究","authors":"Xiaowei Xie, Aixiang Chen","doi":"10.1109/IAEAC54830.2022.9929855","DOIUrl":null,"url":null,"abstract":"SGD (Stochastic gradient descent) is widely used in deep learning, however SGD cannot get linear convergence and is not effective in large amounts of data. This paper use SSAG to improve the efficiency. SSAG contains two optimization strategies, one is stratified sampling strategy and the other is historical gradient averaging strategy. It has the advantages of fast convergence of variance, flexible application to big data, and easy work in deep network. This paper studies the efficiency of SSAG gradient optimization algorithm based on RNN framework. The proposed RNN framework comprises a feature extraction layer, a stacked RNN layer, and a transcription layer. The experimental results confirm that the accuracy of SSAG is better than the SGD and the Momentum. Both stratified sampling and historical averaging strategies have the effect of improving task accuracy. Experimental results verified that SSAG has better effect in image classification task.","PeriodicalId":349113,"journal":{"name":"2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )","volume":"156 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2022-10-03","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Efficiency Study of SSAG on RNN Framework\",\"authors\":\"Xiaowei Xie, Aixiang Chen\",\"doi\":\"10.1109/IAEAC54830.2022.9929855\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"SGD (Stochastic gradient descent) is widely used in deep learning, however SGD cannot get linear convergence and is not effective in large amounts of data. This paper use SSAG to improve the efficiency. SSAG contains two optimization strategies, one is stratified sampling strategy and the other is historical gradient averaging strategy. It has the advantages of fast convergence of variance, flexible application to big data, and easy work in deep network. This paper studies the efficiency of SSAG gradient optimization algorithm based on RNN framework. The proposed RNN framework comprises a feature extraction layer, a stacked RNN layer, and a transcription layer. The experimental results confirm that the accuracy of SSAG is better than the SGD and the Momentum. Both stratified sampling and historical averaging strategies have the effect of improving task accuracy. 
Experimental results verified that SSAG has better effect in image classification task.\",\"PeriodicalId\":349113,\"journal\":{\"name\":\"2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )\",\"volume\":\"156 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2022-10-03\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.1109/IAEAC54830.2022.9929855\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"2022 IEEE 6th Advanced Information Technology, Electronic and Automation Control Conference (IAEAC )","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IAEAC54830.2022.9929855","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Stochastic gradient descent (SGD) is widely used in deep learning; however, it does not achieve linear convergence and is not effective on large amounts of data. This paper uses SSAG to improve training efficiency. SSAG combines two optimization strategies: a stratified sampling strategy and a historical gradient averaging strategy. It has the advantages of fast, variance-reduced convergence, flexible application to big data, and easy use in deep networks. This paper studies the efficiency of the SSAG gradient optimization algorithm within an RNN framework. The proposed framework comprises a feature extraction layer, a stacked RNN layer, and a transcription layer. The experimental results confirm that SSAG achieves higher accuracy than SGD and Momentum, and that both the stratified sampling and the historical averaging strategies improve task accuracy. The experiments also verify that SSAG performs better on the image classification task.
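The abstract summarizes SSAG's two strategies without giving the update rule, so the following is only a minimal NumPy sketch of an assumed SAG-style formulation: partition the data into strata, store one historical gradient per stratum, and at each step refresh one stored gradient and move along their average. The names (`ssag`, `strata`, `grad_fn`) and the least-squares usage example are illustrative, not from the paper.

```python
import numpy as np

def ssag(w0, strata, grad_fn, lr=0.1, steps=1000, rng=None):
    """Sketch of a stratified-sampling + historical-averaging update.

    w0      : initial parameter vector (1-D array)
    strata  : list of index arrays, one per stratum (e.g., grouped by class)
    grad_fn : grad_fn(w, idx) -> gradient of the loss on the examples in idx
    """
    rng = rng or np.random.default_rng(0)
    w = w0.copy()
    k = len(strata)
    g_hist = np.zeros((k, w.size))   # one stored gradient per stratum
    g_sum = np.zeros(w.size)         # running sum of the stored gradients
    for _ in range(steps):
        s = rng.integers(k)          # stratified sampling: pick a stratum
        g_new = grad_fn(w, strata[s])
        g_sum += g_new - g_hist[s]   # swap old contribution for the new one
        g_hist[s] = g_new
        w -= lr * g_sum / k          # step along the historical average
    return w

# Usage: least squares on synthetic data, stratified into 4 groups.
X = np.random.default_rng(1).normal(size=(400, 5))
y = X @ np.ones(5)
strata = [np.arange(i, 400, 4) for i in range(4)]
grad = lambda w, idx: 2 * X[idx].T @ (X[idx] @ w - y[idx]) / len(idx)
w_hat = ssag(np.zeros(5), strata, grad)
```

Grouping examples into homogeneous strata keeps each stored gradient representative of its slice of the data, which is why the averaged history can track the full gradient with lower variance than a single SGD sample.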
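The abstract names the framework's three components but not their configuration. As a rough reading, here is a minimal PyTorch sketch of a CRNN-style arrangement (convolutional feature extraction, a stacked LSTM, and a linear transcription head); the layer sizes, the 28x28 single-channel input, and the class count are all assumptions.

```python
import torch
import torch.nn as nn

class RNNFramework(nn.Module):
    """Feature extraction -> stacked RNN -> transcription (assumed sizes)."""

    def __init__(self, num_classes=10, hidden=128):
        super().__init__()
        # Feature extraction layer: small convolutional front end.
        self.features = nn.Sequential(
            nn.Conv2d(1, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),             # 1x28x28 -> 32x14x14
        )
        # Stacked RNN layer: two LSTM layers, width treated as time.
        self.rnn = nn.LSTM(input_size=32 * 14, hidden_size=hidden,
                           num_layers=2, batch_first=True)
        # Transcription layer: hidden state -> class scores.
        self.transcribe = nn.Linear(hidden, num_classes)

    def forward(self, x):                     # x: (batch, 1, 28, 28)
        f = self.features(x)                  # (batch, 32, 14, 14)
        f = f.permute(0, 3, 1, 2).flatten(2)  # (batch, 14, 32*14)
        out, _ = self.rnn(f)                  # (batch, 14, hidden)
        return self.transcribe(out[:, -1])    # scores from the last step

scores = RNNFramework()(torch.randn(8, 1, 28, 28))  # (8, 10)
```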