Random Walks, Directed Cycles, and Markov Chains
K. Gingell, F. Mendivil
DOI: 10.1080/00029890.2022.2144088
Journal Article, published 2022-12-12
Abstract: A Markov chain is a random process that iteratively travels around its state space, with each transition depending only on the current position and not on the past. When the state space is discrete, we can think of a Markov chain as a special type of random walk on a directed graph. Although a Markov chain normally never settles down but keeps moving around, it usually has a well-defined limiting behavior in a statistical sense. A given finite directed graph can potentially support many different random walks or Markov chains, and each one may have one or more invariant (stationary) distributions. In this paper we explore the question of characterizing the set of all possible invariant distributions. The answer turns out to be quite simple and very natural, and involves the cycles of the graph.
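As a minimal illustration of the objects the abstract describes (not the paper's own construction), the sketch below computes the invariant distribution of a hypothetical "lazy" random walk on a directed 3-cycle by iterating the distribution forward; the transition matrix, state count, and iteration budget are all assumptions chosen for the example.

```python
# Illustrative sketch only, not taken from the paper: a lazy random walk
# on a directed 3-cycle 0 -> 1 -> 2 -> 0. From each vertex the walker
# stays put with probability 1/2 or follows the cycle edge with
# probability 1/2 (the self-loops make the chain aperiodic).

P = [
    [0.5, 0.5, 0.0],  # from state 0: stay, or move to 1
    [0.0, 0.5, 0.5],  # from state 1: stay, or move to 2
    [0.5, 0.0, 0.5],  # from state 2: stay, or move to 0
]

def step(dist, P):
    """One transition: new_dist[j] = sum_i dist[i] * P[i][j]."""
    n = len(P)
    return [sum(dist[i] * P[i][j] for i in range(n)) for j in range(n)]

def stationary(P, iters=200):
    """Approximate an invariant distribution by power iteration."""
    n = len(P)
    dist = [1.0] + [0.0] * (n - 1)  # start the walk at state 0
    for _ in range(iters):
        dist = step(dist, P)
    return dist

pi = stationary(P)
# This chain is doubly stochastic (rows and columns each sum to 1),
# so its unique invariant distribution is uniform: pi is close to
# [1/3, 1/3, 1/3], reflecting the symmetry of the directed cycle.
print(pi)
```

Power iteration works here because the self-loops break the periodicity of the pure directed cycle; the deterministic cycle itself would be periodic, and the distribution would rotate forever instead of converging.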