Adaptive jitter buffer management: a game theoretic approach
P. Chandran, Chelpa Lingam
Int. J. Commun. Networks Distributed Syst., published 2018-07-27
DOI: 10.1504/IJCNDS.2018.10011127 (https://doi.org/10.1504/IJCNDS.2018.10011127)
Citations: 0
Abstract
Network impairments such as delay and jitter significantly affect the quality of voice over internet protocol (VoIP) calls. The main challenge is the synchronous playout of voice packets at the receiver in the face of varying jitter. This is typically addressed with a jitter buffer, which delays the playout time of received packets to accommodate late arrivals. In this paper, we propose a game theoretic approach to the trade-off between additional buffer delay and buffer scaling. We formulate the problem as a zero-sum game between two players, the playout scheduler and the buffer manager, that adaptively adjusts the buffer size according to the buffer delay of the packets, and we solve the game using the uncertainty principle of game theory. Experimental results show that the proposed model allocates optimal buffer space with the lowest late-loss rate compared with other algorithms.
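As context for the trade-off the abstract describes, the sketch below shows the classic adaptive playout-delay estimator (the Ramjee et al. style EWMA baseline that adaptive jitter-buffer work is commonly compared against). This is an assumed illustrative baseline, not the paper's game-theoretic model; the constant `ALPHA` and the margin factor `k` are conventional values from the literature, not parameters taken from this paper.

```python
# Baseline adaptive playout-delay estimation (assumption: classic EWMA
# scheme, NOT the game-theoretic model proposed in the paper).

ALPHA = 0.998002  # smoothing factor commonly used in the literature


def update_estimates(d_prev, v_prev, network_delay):
    """Update EWMA estimates of mean delay d and delay variation v
    for one arriving packet, given its measured network delay."""
    d = ALPHA * d_prev + (1 - ALPHA) * network_delay
    v = ALPHA * v_prev + (1 - ALPHA) * abs(network_delay - d)
    return d, v


def playout_delay(d, v, k=4.0):
    """Playout offset: mean delay plus a safety margin of k delay
    variations; larger k trades extra buffering delay for fewer
    late-packet losses."""
    return d + k * v
```

A larger margin `k` enlarges the buffer (more end-to-end delay) but catches more late packets, which is exactly the tension the paper casts as a zero-sum game between the playout scheduler and the buffer manager.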