Ergodicity of Markov Processes via Nonstandard Analysis
Haosui Duanmu, J. Rosenthal, W. Weiss
DOI: 10.1090/memo/1342
Publication date: 2021-09-01
The Markov chain ergodic theorem is well understood when either the time line or the state space is discrete. However, no comparably sharp result exists for continuous-time Markov processes on general state spaces. Using methods from mathematical logic and nonstandard analysis, we introduce a class of hyperfinite Markov processes: general Markov processes that behave like finite-state-space, discrete-time Markov processes. We show that, under moderate conditions, the transition probabilities of hyperfinite Markov processes align with those of standard Markov processes. The Markov chain ergodic theorem for hyperfinite Markov processes then implies the Markov chain ergodic theorem for continuous-time Markov processes on general state spaces.
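The finite-state, discrete-time case that the abstract describes as well understood can be checked numerically. The sketch below (my own illustration, not from the paper; the two-state transition matrix is an arbitrary example) verifies the classical ergodic conclusion: for an irreducible, aperiodic chain, powers of the transition matrix P converge to a rank-one matrix whose rows all equal the stationary distribution pi, where pi P = pi.

```python
import numpy as np

# Arbitrary irreducible, aperiodic two-state transition matrix
# (rows sum to 1; all entries positive).
P = np.array([[0.9, 0.1],
              [0.4, 0.6]])

# Stationary distribution: left eigenvector of P for eigenvalue 1,
# normalized to sum to 1.
vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

# Ergodic theorem, finite case: P^n -> matrix with every row equal to pi.
Pn = np.linalg.matrix_power(P, 50)
assert np.allclose(Pn, np.tile(pi, (len(pi), 1)), atol=1e-8)
assert np.allclose(pi @ P, pi)  # pi is indeed stationary
print(pi)  # stationary distribution, here [0.8, 0.2]
```

For this P the stationary distribution solves 0.1*pi_1 = 0.4*pi_2, giving pi = (0.8, 0.2); the second eigenvalue of P is 0.5, so convergence of P^n is geometric. It is exactly this finite picture that the paper transfers, via hyperfinite Markov processes, to continuous time and general state spaces.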