Authors: Ajay Jasra, Hamza Ruzayqat, Amin Wu
DOI: 10.1007/s11222-024-10389-6
Journal: Statistics and Computing
Published: 2024-02-14 (Journal Article)
Bayesian parameter inference for partially observed stochastic Volterra equations
In this article we consider Bayesian parameter inference for a type of partially observed stochastic Volterra equation (SVE). SVEs arise in many areas, such as physics and mathematical finance; in the latter field they can be used to represent long memory in unobserved volatility processes. In many cases of practical interest, SVEs must be time-discretized, and parameter inference is then based upon the posterior associated with this time-discretized process. Following recent studies on the time-discretization of SVEs (e.g. Richard et al. in Stoch Proc Appl 141:109–138, 2021), we use Euler–Maruyama methods for the aforementioned discretization. We then show how multilevel Markov chain Monte Carlo (MCMC) methods (Jasra et al. in SIAM J Sci Comp 40:A887–A902, 2018) can be applied in this context. In the examples we study, we prove that the cost to achieve a mean square error (MSE) of \(\mathcal {O}(\epsilon ^2)\), \(\epsilon >0\), is \(\mathcal {O}(\epsilon ^{-\tfrac{4}{2H+1}})\), where H is the Hurst parameter. If one uses a single-level MCMC method, the cost to achieve the same MSE is \(\mathcal {O}(\epsilon ^{-\tfrac{2(2H+3)}{2H+1}})\). We illustrate these results in the context of state-space and stochastic volatility models, with the latter applied to real data.
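To make the abstract concrete, the sketch below shows a left-point Euler–Maruyama scheme for a scalar SVE with a fractional kernel, together with the two cost exponents (in powers of \(\epsilon^{-1}\)) quoted above. This is a hypothetical illustration, not the authors' code: the kernel \(K(u) = u^{H-1/2}\), the drift, and the diffusion coefficient are illustrative assumptions, and the cost formulas simply restate the rates from the abstract.

```python
import numpy as np

def euler_maruyama_sve(x0, b, sigma, H, T, n, rng):
    """Left-point Euler-Maruyama scheme for a scalar SVE
        X_t = x0 + int_0^t K(t-s) b(X_s) ds + int_0^t K(t-s) sigma(X_s) dW_s
    with the (assumed, illustrative) fractional kernel K(u) = u^(H - 1/2).
    Returns the grid t_k = k*T/n and the discretized path X on that grid.
    """
    dt = T / n
    t = np.linspace(0.0, T, n + 1)
    dW = rng.normal(0.0, np.sqrt(dt), size=n)  # Brownian increments
    X = np.empty(n + 1)
    X[0] = x0
    for k in range(1, n + 1):
        # Kernel evaluated at t_k - t_j for j = 0..k-1; the left-point rule
        # avoids the singularity of K at u = 0.
        u = t[k] - t[:k]
        K = u ** (H - 0.5)
        X[k] = (x0
                + np.sum(K * b(X[:k])) * dt
                + np.sum(K * sigma(X[:k]) * dW[:k]))
    return t, X

def cost_exponents(H):
    """Cost exponents c such that cost = O(eps^-c) for MSE O(eps^2),
    restating the rates in the abstract: multilevel MCMC gives 4/(2H+1),
    single-level MCMC gives 2(2H+3)/(2H+1)."""
    return 4.0 / (2.0 * H + 1.0), 2.0 * (2.0 * H + 3.0) / (2.0 * H + 1.0)

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    # Toy mean-reverting drift and constant diffusion, H = 0.25.
    t, X = euler_maruyama_sve(
        x0=1.0,
        b=lambda x: -x,
        sigma=lambda x: 0.3 * np.ones_like(x),
        H=0.25, T=1.0, n=64, rng=rng,
    )
    ml, sl = cost_exponents(0.25)
    print(f"multilevel exponent: {ml:.3f}, single-level exponent: {sl:.3f}")
```

For H = 0.25 the exponents are 4/1.5 ≈ 2.67 (multilevel) versus 7/1.5 ≈ 4.67 (single-level), which is the source of the claimed savings; the gap 2(2H+3)/(2H+1) - 4/(2H+1) = 2 holds for every H.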
Journal description:
Statistics and Computing is a bimonthly refereed journal that publishes papers covering the interface between the statistical and computing sciences.
In particular, it addresses the use of statistical concepts in computing science, for example in machine learning, computer vision and data analytics, as well as the use of computers in data modelling, prediction and analysis. Specific topics covered include:

- techniques for evaluating analytically intractable problems, such as bootstrap resampling, Markov chain Monte Carlo, sequential Monte Carlo and approximate Bayesian computation
- search and optimization methods
- stochastic simulation and Monte Carlo
- graphics and computer environments
- statistical approaches to software errors
- information retrieval
- machine learning
- statistics of databases and database technology
- huge data sets and big data analytics
- computer algebra
- graphical models
- image processing and tomography
- inverse problems and uncertainty quantification
In addition, the journal contains original research reports, authoritative review papers, discussion papers, and occasional special issues on particular topics or carrying the proceedings of relevant conferences. Statistics and Computing also publishes book review and software review sections.