An effective one-iteration learning algorithm based on Gaussian mixture expansion for densities

IF 3.4 · CAS Tier 2 (Mathematics) · JCR Q1 MATHEMATICS, APPLIED
Weiguo Lu, Xuan Wu, Deng Ding, Gangnan Yuan, Jirong Zhuang
DOI: 10.1016/j.cnsns.2024.108494
Journal: Communications in Nonlinear Science and Numerical Simulation
Published: 2024-11-30
Citations: 0

Abstract

In this study, we use the Gaussian mixture model (GMM) and propose a novel learning algorithm that approximates any density quickly and simply. In our previous study, we proposed an idea called GMM expansion, inspired by Fourier expansion. Just as Fourier expansion uses a basis of frequencies, GMM expansion assumes that normal distributions placed evenly along the support can serve as a set of bases that approximates a large class of distributions with good accuracy. In this work, a new algorithm is proposed based on the idea of GMM expansion, and a theoretical analysis is given to verify its convergence. Various experiments are carried out to examine the efficacy of the proposed method. The results demonstrate its advantages and show that the new algorithm is faster, more accurate, more stable, and easier to use than the expectation-maximization (EM) algorithm. Furthermore, these benefits help improve the integration of GMMs into neural networks: our experiments show that a neural network equipped with the proposed method handles inverse problems and data uncertainty significantly better. Finally, another application, a GMM-based neural network generator, is built. This application shows the potential of using random sampling from distributions for feature-variation control in generative models.
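The core idea described in the abstract — evenly spaced Gaussian bases over the support, with only the mixture weights learned from data — can be illustrated with a minimal sketch. This is not the paper's one-iteration learning algorithm; it is a simplified stand-in that fixes the means on a uniform grid, ties a shared bandwidth to the grid spacing, and sets each weight from the empirical mass near its basis (all parameter choices here are illustrative assumptions):

```python
import numpy as np

def gmm_expansion_density(samples, support=(-5.0, 5.0), n_bases=30):
    """Approximate a 1-D density with evenly spaced Gaussian bases.

    Sketch of the GMM-expansion idea only: means sit at the centers of a
    uniform grid over the support, all bases share one bandwidth, and the
    weights are the empirical sample fractions per grid cell (a stand-in
    for the paper's learning rule).
    """
    lo, hi = support
    edges = np.linspace(lo, hi, n_bases + 1)
    means = 0.5 * (edges[:-1] + edges[1:])   # evenly spaced basis centers
    sigma = (hi - lo) / n_bases              # shared bandwidth = grid spacing

    # Weight each basis by the fraction of samples falling in its cell.
    counts, _ = np.histogram(samples, bins=edges)
    weights = counts / counts.sum()

    def density(x):
        x = np.asarray(x, dtype=float)[..., None]
        bases = np.exp(-0.5 * ((x - means) / sigma) ** 2) / (
            sigma * np.sqrt(2.0 * np.pi)
        )
        return bases @ weights  # mixture of fixed Gaussian bases

    return density

rng = np.random.default_rng(0)
samples = rng.normal(0.0, 1.0, size=10_000)
f = gmm_expansion_density(samples)
print(f(0.0))  # should be close to the N(0, 1) density at 0 (~0.3989)
```

Because the means and bandwidth are fixed in advance, only the weight vector has to be estimated, which is what makes a fast, EM-free fitting procedure possible in the first place.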
Source journal
Communications in Nonlinear Science and Numerical Simulation (MATHEMATICS, APPLIED; MATHEMATICS, INTERDISCIPLINARY APPLICATIONS)
CiteScore: 6.80
Self-citation rate: 7.70%
Annual output: 378 articles
Review time: 78 days
About the journal: The journal publishes original research findings on experimental observation, mathematical modeling, theoretical analysis and numerical simulation, for more accurate description, better prediction or novel application, of nonlinear phenomena in science and engineering. It offers a venue for researchers to make rapid exchange of ideas and techniques in nonlinear science and complexity. The submission of manuscripts with cross-disciplinary approaches in nonlinear science and complexity is particularly encouraged. Topics of interest: Nonlinear differential or delay equations, Lie group analysis and asymptotic methods, Discontinuous systems, Fractals, Fractional calculus and dynamics, Nonlinear effects in quantum mechanics, Nonlinear stochastic processes, Experimental nonlinear science, Time-series and signal analysis, Computational methods and simulations in nonlinear science and engineering, Control of dynamical systems, Synchronization, Lyapunov analysis, High-dimensional chaos and turbulence, Chaos in Hamiltonian systems, Integrable systems and solitons, Collective behavior in many-body systems, Biological physics and networks, Nonlinear mechanical systems, Complex systems and complexity. No length limitation for contributions is set, but only concisely written manuscripts are published. Brief papers are published on the basis of Rapid Communications. Discussions of previously published papers are welcome.