Title: Online Learning Methods for Networking
Authors: Cem Tekin, M. Liu
Journal: Found. Trends Netw.
DOI: 10.1561/1300000050 (https://doi.org/10.1561/1300000050)
Publication date: 2014-12-19
Citations: 22
Abstract
In this monograph we provided a tutorial on a family of sequential learning and decision problems known as multi-armed bandit problems. We introduced a wide range of application scenarios for this learning framework, as well as its many different variants. Our more detailed discussion focused on stochastic bandit problems, with rewards driven by either an IID or a Markovian process, and with environments consisting of either a single user or multiple simultaneous users. We also presented literature on the learning of MDPs, a framework that captures coupling among the evolution of different options, which the classical MAB problem does not.
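To make the stochastic (IID-reward) bandit setting concrete, the sketch below implements UCB1, a classical index policy for this setting; the monograph covers a range of such policies, so this is only an illustrative example, not a reproduction of any specific algorithm from the text. The arm means, horizon, and helper names are all hypothetical choices for the demo.

```python
import math
import random

def ucb1(arm_means, horizon, seed=0):
    """Run the classical UCB1 index policy on Bernoulli arms.

    arm_means: true success probabilities (unknown to the learner).
    Returns the number of times each arm was pulled.
    """
    rng = random.Random(seed)
    k = len(arm_means)
    counts = [0] * k   # pulls per arm
    sums = [0.0] * k   # cumulative reward per arm

    for t in range(horizon):
        if t < k:
            arm = t    # initialization: pull each arm once
        else:
            # UCB1 index = empirical mean + exploration bonus;
            # the bonus shrinks as an arm accumulates pulls.
            arm = max(
                range(k),
                key=lambda i: sums[i] / counts[i]
                + math.sqrt(2 * math.log(t + 1) / counts[i]),
            )
        reward = 1.0 if rng.random() < arm_means[arm] else 0.0
        counts[arm] += 1
        sums[arm] += reward
    return counts

pulls = ucb1([0.2, 0.5, 0.8], horizon=5000)
```

Over a long horizon the policy concentrates its pulls on the best arm while still sampling the others logarithmically often, which is the exploration/exploitation trade-off at the heart of the MAB problems surveyed here.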