Shen-Jie Tang , Yu Tang , Xi-Feng Li , Bo Liu , Dong-Jie Bi , Guo Yi , Xue-Peng Zheng , Li-Biao Peng , Yong-Le Xie
Journal of Electronic Science and Technology, vol. 21, no. 3, Article 100217, September 2023. DOI: 10.1016/j.jnlest.2023.100217
Nyström kernel algorithm based on least logarithmic hyperbolic cosine loss
Kernel adaptive filters (KAFs) have attracted substantial attention for online non-linear learning applications. The effectiveness of a KAF relies heavily on a well-chosen learning criterion. In this regard, the logarithmic hyperbolic cosine (lncosh) criterion, with its improved robustness and convergence behavior, has drawn attention in recent studies. However, existing lncosh loss-based KAFs rely on stochastic gradient descent (SGD) for optimization, which offers a poor trade-off between convergence speed and accuracy, whereas recursion-based KAFs can deliver more effective filtering performance. Therefore, a Nyström method-based robust sparse kernel recursive least lncosh loss algorithm is derived in this article. Experiments on measured and synthetic data corrupted by non-Gaussian noise confirm its superiority in terms of robustness, accuracy, and computational cost.
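The two ingredients named in the abstract — the lncosh loss and the Nyström kernel approximation — can be sketched as follows. This is a minimal illustrative sketch, not the paper's algorithm: the Gaussian kernel, the shape parameter `lam`, and the regularization constants are assumptions chosen for clarity.

```python
import numpy as np

def lncosh_loss(e, lam=1.0):
    # Logarithmic hyperbolic cosine (lncosh) loss:
    #   L(e) = (1/lam) * ln(cosh(lam * e)).
    # It behaves like a squared loss for small errors and like an
    # absolute loss for large errors, which is what gives robustness
    # against impulsive (non-Gaussian) noise.
    return np.log(np.cosh(lam * e)) / lam

def nystrom_features(X, centers, kernel_width=1.0):
    # Nystrom approximation: map inputs into a finite-dimensional
    # feature space defined by m sampled centers, so that a recursive
    # linear filter can then run at a fixed cost per sample instead of
    # growing with the number of observed data points.
    def gaussian(A, B):
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * kernel_width ** 2))

    K_mm = gaussian(centers, centers)
    # Symmetric eigendecomposition yields K_mm^{-1/2} for the map
    # phi(x) = K_mm^{-1/2} k_m(x); a small jitter keeps it stable.
    w, V = np.linalg.eigh(K_mm + 1e-10 * np.eye(len(centers)))
    K_mm_inv_sqrt = V @ np.diag(1.0 / np.sqrt(np.maximum(w, 1e-12))) @ V.T
    return gaussian(X, centers) @ K_mm_inv_sqrt

# Example: 20 two-dimensional inputs mapped onto 5 Nystrom centers.
rng = np.random.default_rng(0)
X = rng.normal(size=(20, 2))
Phi = nystrom_features(X, X[:5])  # shape (20, 5)
```

With such fixed-dimensional features, the recursive least-squares machinery can be applied to the lncosh criterion, which is the combination the paper exploits.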
About the journal:
JEST (International) covers state-of-the-art achievements in electronic science and technology, with particular focus on the following areas: Communication Technology; Computer Science and Information Technology; Information and Network Security; Bioelectronics and Biomedicine; Neural Networks and Intelligent Systems; Electronic Systems and Array Processing; Optoelectronic and Photonic Technologies; Electronic Materials and Devices; Sensing and Measurement; Signal Processing and Image Processing. JEST (International) is dedicated to building an open, high-level academic journal supported by researchers, professionals, and academicians. The journal is fully indexed by Ei INSPEC and has published contributions from more than 20 countries and regions.