Markov Chains
Kevin J. Hastings
DOI: 10.1201/9781315275987-4
Published 2018-10-24 in Introduction to the Mathematics of Operations Research with Mathematica®
Citations: 0
Abstract
• A Markov chain is a memoryless stochastic process with a discrete state space. In essence, it is a system that changes state according to given probabilities, and those probabilities depend only on the current state, not on past states. • For example, suppose you are shopping among three stores. Each hour, you either stay at your current store or move to a new one. From any store, there is a 20% chance you stay where you are and a 40% chance you move to each of the other two stores.
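The three-store example can be sketched as a transition matrix and iterated to see where the chain settles. This is an illustrative sketch (not from the chapter): the matrix entries come from the stated probabilities, assuming the 40% chance of changing stores splits evenly between the two other stores.

```python
import numpy as np

# Transition matrix for the three-store example:
# from any store, stay with probability 0.2 and move to
# each of the other two stores with probability 0.4.
P = np.array([
    [0.2, 0.4, 0.4],
    [0.4, 0.2, 0.4],
    [0.4, 0.4, 0.2],
])

# Each row must sum to 1 for P to be a valid stochastic matrix.
assert np.allclose(P.sum(axis=1), 1.0)

# Distribution after n hours, starting at store 0: pi_n = pi_0 P^n.
pi0 = np.array([1.0, 0.0, 0.0])
print(pi0 @ np.linalg.matrix_power(P, 5))

# By symmetry the chain converges to the uniform distribution
# (1/3, 1/3, 1/3) regardless of the starting store.
print(pi0 @ np.linalg.matrix_power(P, 50))
```

Raising P to a power applies the memoryless update repeatedly; because the matrix is symmetric here, the long-run distribution is uniform over the three stores.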