{"title":"Fully Dynamic Maximal Matching in O (log n) Update Time","authors":"Surender Baswana, Manoj Gupta, Sandeep Sen","doi":"10.1137/130914140","DOIUrl":null,"url":null,"abstract":"We present an algorithm for maintaining maximal matching in a graph under addition and deletion of edges. Our data structure is randomized that takes $O( \\log n)$ expected amortized time for each edge update where $n$ is the number of vertices in the graph. While there is a trivial $O(n)$ algorithm for edge update, the previous best known result for this problem was due to Ivkovi\\'c and Llyod\\cite{llyod}. For a graph with $n$ vertices and $m$ edges, they give an $O( {(n+ m)}^{0.7072})$ update time algorithm which is sub linear only for a sparse graph. %To the best of our knowledge this %is the first polylog update time for maximal matching that implies an % exponential improvement from the previous results. For the related problem of maximum matching, Onak and Rubinfeld \\cite{onak} designed a randomized data structure that achieves $O(\\log^2 n)$ expected amortized time for each update for maintaining a $c$-approximate maximum matching for some large constant $c$. In contrast, we can maintain a factor two approximate maximum matching in $O(\\log n )$ expected amortized time per update as a direct corollary of the maximal matching scheme. This in turn also implies a two approximate vertex cover maintenance scheme that takes $O(\\log n )$expected amortized time per update.","PeriodicalId":326048,"journal":{"name":"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science","volume":"5 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2011-03-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"149","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"2011 IEEE 52nd Annual Symposium on Foundations of Computer Science","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.1137/130914140","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 149
Abstract
We present an algorithm for maintaining a maximal matching in a graph under insertion and deletion of edges. Our data structure is randomized and takes $O(\log n)$ expected amortized time per edge update, where $n$ is the number of vertices in the graph. While there is a trivial $O(n)$ algorithm for an edge update, the previous best known result for this problem was due to Ivkovi\'c and Lloyd \cite{llyod}. For a graph with $n$ vertices and $m$ edges, they gave an $O({(n+m)}^{0.7072})$ update time algorithm, which is sublinear only for sparse graphs. For the related problem of maximum matching, Onak and Rubinfeld \cite{onak} designed a randomized data structure that achieves $O(\log^2 n)$ expected amortized time per update for maintaining a $c$-approximate maximum matching, for some large constant $c$. In contrast, as a direct corollary of the maximal matching scheme, we can maintain a factor-two approximate maximum matching in $O(\log n)$ expected amortized time per update. This in turn also yields a factor-two approximate vertex cover maintenance scheme that takes $O(\log n)$ expected amortized time per update.
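To make the baseline concrete, below is a minimal sketch of the trivial $O(n)$-per-update scheme that the abstract contrasts against; it is not the authors' $O(\log n)$ randomized data structure. An insertion matches the new edge if both endpoints are free, and a deletion of a matched edge rescans the neighborhoods of its two endpoints for free partners. The class name `DynamicMaximalMatching` and the `mate` dictionary are illustrative choices, not notation from the paper.

```python
from collections import defaultdict

class DynamicMaximalMatching:
    """Trivial baseline: maintains a maximal matching under edge updates."""

    def __init__(self):
        self.adj = defaultdict(set)  # current adjacency lists
        self.mate = {}               # mate[u] == v iff edge (u, v) is matched

    def insert(self, u, v):
        # O(1): match the new edge if both endpoints are currently free.
        self.adj[u].add(v)
        self.adj[v].add(u)
        if u not in self.mate and v not in self.mate:
            self.mate[u], self.mate[v] = v, u

    def delete(self, u, v):
        # O(n): if the deleted edge was matched, both endpoints become free,
        # so scan their neighborhoods for free vertices to rematch with.
        self.adj[u].discard(v)
        self.adj[v].discard(u)
        if self.mate.get(u) == v:
            del self.mate[u], self.mate[v]
            self._rematch(u)
            self._rematch(v)

    def _rematch(self, u):
        # Match u with any free neighbor; O(deg(u)) = O(n) in the worst case.
        for w in self.adj[u]:
            if w not in self.mate:
                self.mate[u], self.mate[w] = w, u
                return
```

Maximality of the maintained matching $M$ is exactly what gives the factor-two guarantees stated above: every edge of a maximum matching $M^*$ has at least one endpoint matched in $M$ (otherwise that edge could be added to $M$), so $|M^*| \le 2|M|$, and the set of matched vertices forms a vertex cover of size $2|M|$, which is at most twice the minimum vertex cover since any cover must pick an endpoint of each matched edge.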