{"title":"Forgetting factor multi-error stochastic gradient algorithm based on minimum error entropy","authors":"Shaoxue Jing","doi":"10.1109/IAI50351.2020.9262232","DOIUrl":null,"url":null,"abstract":"Entropy has been widely applied in system identification in the last decade. In this paper, a novel stochastic gradient algorithm based on minimum entropy is proposed. Though needing less computation than the mean squares error algorithm, traditional stochastic gradient algorithm converges quite slowly. To fasten the algorithm, a multi-error method and a forgetting factor are integrated into the algorithm. Firstly, the scalar error is replaced by a vector error with different error length. Secondly, a forgetting factor is adopted to calculate the step size. The proposed algorithm is utilized to estimate the parameters of a finite impulse response model. Estimation results indicate that the proposed algorithm can obtain more accurate estimates than traditional gradient algorithm and has a faster converge speed.","PeriodicalId":137183,"journal":{"name":"2020 2nd International Conference on Industrial Artificial Intelligence (IAI)","volume":"51 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-10-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2020 2nd International Conference on Industrial Artificial Intelligence (IAI)","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1109/IAI50351.2020.9262232","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Entropy has been widely applied to system identification over the last decade. In this paper, a novel stochastic gradient algorithm based on the minimum error entropy criterion is proposed. Although it requires less computation than the mean square error algorithm, the traditional stochastic gradient algorithm converges quite slowly. To accelerate convergence, a multi-error method and a forgetting factor are integrated into the algorithm. First, the scalar error is replaced by a vector error whose length can be varied. Second, a forgetting factor is adopted in the calculation of the step size. The proposed algorithm is used to estimate the parameters of a finite impulse response (FIR) model. The estimation results indicate that the proposed algorithm obtains more accurate estimates than the traditional stochastic gradient algorithm and converges faster.
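
The abstract gives only a high-level description, so the Python sketch below is an illustration rather than the paper's exact algorithm. It assumes the FIR model y(t) = phi(t)^T theta + v(t) with phi(t) = [u(t), u(t-1), ..., u(t-n+1)], a Gaussian kernel of width sigma for the error-entropy (information potential) criterion, a sliding window of the last p errors as the "vector error", and the forgetting-factor step-size rule r(t) = lam*r(t-1) + ||phi(t)||^2 with step 1/r(t). All names (ff_mesg_fir, p, lam, sigma) are hypothetical and not taken from the paper.

    # Hypothetical sketch: forgetting-factor multi-error stochastic gradient
    # estimation of FIR parameters under a minimum-error-entropy-style criterion.
    # The kernel, window, and step-size rule are assumptions for illustration.
    import numpy as np

    def ff_mesg_fir(u, y, n, p=5, lam=0.95, sigma=1.0):
        """Estimate the n FIR taps theta from input u and output y (assumed model:
        y(t) = phi(t)^T theta + v(t), phi(t) = [u(t), u(t-1), ..., u(t-n+1)])."""
        u = np.asarray(u, dtype=float)
        y = np.asarray(y, dtype=float)
        N = len(y)
        theta = np.zeros(n)
        r = 1.0                                   # step-size accumulator
        for t in range(n - 1, N):
            phi_t = u[t - n + 1:t + 1][::-1]      # current regressor
            r = lam * r + phi_t @ phi_t           # forgetting-factor gain update (assumed rule)
            mu = 1.0 / r                          # step size
            # Multi-error window: the last (at most) p regressors and outputs.
            t0 = max(n - 1, t - p + 1)
            Phi = np.array([u[k - n + 1:k + 1][::-1] for k in range(t0, t + 1)])
            e = y[t0:t + 1] - Phi @ theta         # vector error over the window
            # Gradient of the Gaussian-kernel information potential over the window
            # (minimizing error entropy corresponds to maximizing this potential).
            grad = np.zeros(n)
            m = len(e)
            for i in range(m):
                for j in range(m):
                    d = e[i] - e[j]
                    k_ij = np.exp(-d * d / (2.0 * sigma ** 2))
                    grad += k_ij * (d / sigma ** 2) * (Phi[i] - Phi[j])
            grad /= m * m
            theta = theta + mu * grad             # gradient ascent on the potential
        return theta

A call such as theta_hat = ff_mesg_fir(u, y, n=4, p=5, lam=0.95, sigma=1.0) would return the estimated impulse response for a 4-tap model. Compared with the classical gain r(t) = r(t-1) + ||phi(t)||^2, the forgetting factor keeps the gain from shrinking toward zero, which is one plausible reading of how the forgetting factor speeds up convergence in the abstract's description.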