Less-Last Number Hyperparameter Algorithm for MCMC on Online Scheme: In Updating Hyperparameter Online Gaussian Process

S. S. Sholihat, S. Indratno, U. Mukhaiyar
DOI: 10.1109/FAIML57028.2022.00011

2022 International Conference on Frontiers of Artificial Intelligence and Machine Learning (FAIML), June 2022

Abstract

Hyper-parameter estimation is a central topic in Gaussian process modeling, and Markov Chain Monte Carlo (MCMC) is a simple and powerful method for it. In the internet era, the influence of newly arriving data on the hyper-parameters of a Gaussian process must be accounted for in an online manner, so the estimates need updating. The main obstacle to hyper-parameter estimation with MCMC is computation time: the large covariance matrices of the Gaussian process make each MCMC iteration expensive. We propose the Less-Last Number Hyper-parameter (LLNH) algorithm to reduce the computation time of MCMC. The idea is to merge two MCMC sub-posteriors iteratively. The first sub-posterior yields the most recent hyper-parameter estimate. The second sub-posterior runs MCMC on the last $m$ data points, including the newly arrived data ($m \ll$ data size), initialized at the recent estimate. Merging the two produces the updated hyper-parameter estimate. Technically, MCMC operates on only $m$ data points in every iteration, which reduces computation time, and starting the chain from the recent estimate lets it represent the information contained in the remaining data. The algorithm is an efficient method for hyper-parameter estimation of Gaussian process regression on real-time data and is applicable to the online scheme. The results show that the LLNH algorithm performs well on hyper-parameter estimation and requires less computation time than offline MCMC.
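The core mechanism described above — re-running MCMC only on a short window of the last $m$ points, warm-started from the previous hyper-parameter estimate — can be sketched as follows. This is a minimal illustration under our own assumptions (a 1-D squared-exponential kernel, fixed noise, random-walk Metropolis, posterior-mean estimates), not the authors' exact LLNH implementation; all function names and parameter values here are hypothetical.

```python
import numpy as np

def rbf_kernel(x, lengthscale, variance):
    # Squared-exponential covariance matrix for 1-D inputs.
    d = x[:, None] - x[None, :]
    return variance * np.exp(-0.5 * (d / lengthscale) ** 2)

def log_posterior(theta, x, y, noise=0.1):
    # GP log marginal likelihood; theta holds log hyper-parameters
    # (log keeps lengthscale and variance positive under a flat prior).
    lengthscale, variance = np.exp(theta)
    K = rbf_kernel(x, lengthscale, variance) + noise**2 * np.eye(len(x))
    sign, logdet = np.linalg.slogdet(K)
    if sign <= 0:
        return -np.inf
    alpha = np.linalg.solve(K, y)
    return -0.5 * (y @ alpha + logdet + len(x) * np.log(2 * np.pi))

def mh_update(theta, x, y, n_iter=200, step=0.1, rng=None):
    # Random-walk Metropolis over log hyper-parameters, warm-started
    # from the current estimate `theta` (the "recent estimation").
    rng = np.random.default_rng() if rng is None else rng
    lp = log_posterior(theta, x, y)
    samples = []
    for _ in range(n_iter):
        prop = theta + step * rng.standard_normal(theta.shape)
        lp_prop = log_posterior(prop, x, y)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        samples.append(theta)
    return np.mean(samples, axis=0)  # posterior-mean estimate

# Online loop: as new data arrive, re-estimate on only the last m
# points, so each chain pays O(m^3), not O(n^3), per likelihood call.
rng = np.random.default_rng(0)
m = 30
x_all = np.linspace(0, 10, 200)
y_all = np.sin(x_all) + 0.1 * rng.standard_normal(x_all.shape)
theta = np.zeros(2)  # initial log(lengthscale), log(variance)
for t in range(m, len(x_all), 20):  # new data arrive in batches
    xw, yw = x_all[t - m:t], y_all[t - m:t]  # m-point window incl. new data
    theta = mh_update(theta, xw, yw, rng=rng)
print(np.exp(theta))  # current (lengthscale, variance) estimate
```

The key cost saving is that every likelihood evaluation factorizes an $m \times m$ covariance matrix instead of an $n \times n$ one, while warm-starting the chain at the previous estimate stands in for the data outside the window.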