{"title":"Understanding and Mitigating the Impacts of Differentially Private Census Data on State Level Redistricting","authors":"Christian Cianfarani, Aloni Cohen","doi":"arxiv-2409.06801","DOIUrl":null,"url":null,"abstract":"Data from the Decennial Census is published only after applying a disclosure\navoidance system (DAS). Data users were shaken by the adoption of differential\nprivacy in the 2020 DAS, a radical departure from past methods. The change\nraises the question of whether redistricting law permits, forbids, or requires\ntaking account of the effect of disclosure avoidance. Such uncertainty creates\nlegal risks for redistricters, as Alabama argued in a lawsuit seeking to\nprevent the 2020 DAS's deployment. We consider two redistricting settings in\nwhich a data user might be concerned about the impacts of privacy preserving\nnoise: drawing equal population districts and litigating voting rights cases.\nWhat discrepancies arise if the user does nothing to account for disclosure\navoidance? How might the user adapt her analyses to mitigate those\ndiscrepancies? We study these questions by comparing the official 2010\nRedistricting Data to the 2010 Demonstration Data -- created using the 2020 DAS\n-- in an analysis of millions of algorithmically generated state legislative\nredistricting plans. In both settings, we observe that an analyst may come to\nincorrect conclusions if they do not account for noise. With minor adaptations,\nthough, the underlying policy goals remain achievable: tweaking selection\ncriteria enables a redistricter to draw balanced plans, and illustrative plans\ncan still be used as evidence of the maximum number of majority-minority\ndistricts that are possible in a geography. At least for state legislatures,\nAlabama's claim that differential privacy ``inhibits a State's right to draw\nfair lines'' appears unfounded.","PeriodicalId":501112,"journal":{"name":"arXiv - CS - Computers and Society","volume":"4 1","pages":""},"PeriodicalIF":0.0000,"publicationDate":"2024-09-10","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"arXiv - CS - Computers and Society","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/arxiv-2409.06801","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0
Abstract
Data from the Decennial Census is published only after applying a disclosure
avoidance system (DAS). Data users were shaken by the adoption of differential
privacy in the 2020 DAS, a radical departure from past methods. The change
raises the question of whether redistricting law permits, forbids, or requires
taking account of the effect of disclosure avoidance. Such uncertainty creates
legal risks for redistricters, as Alabama argued in a lawsuit seeking to
prevent the 2020 DAS's deployment. We consider two redistricting settings in
which a data user might be concerned about the impacts of privacy-preserving
noise: drawing equal population districts and litigating voting rights cases.
What discrepancies arise if the user does nothing to account for disclosure
avoidance? How might the user adapt her analyses to mitigate those
discrepancies? We study these questions by comparing the official 2010
Redistricting Data to the 2010 Demonstration Data -- created using the 2020 DAS
-- in an analysis of millions of algorithmically generated state legislative
redistricting plans. In both settings, we observe that an analyst may come to
incorrect conclusions if they do not account for noise. With minor adaptations,
though, the underlying policy goals remain achievable: tweaking selection
criteria enables a redistricter to draw balanced plans, and illustrative plans
can still be used as evidence of the maximum number of majority-minority
districts that are possible in a geography. At least for state legislatures,
Alabama's claim that differential privacy "inhibits a State's right to draw
fair lines" appears unfounded.
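
To make the population-balance concern concrete, here is a minimal, hypothetical sketch in Python. It is not the paper's code, and it does not reproduce the 2020 DAS (TopDown), which is far more complex; it simply adds symmetric discrete-Laplace noise to simulated block counts and shows how a redistricter might tighten her selection criterion (a buffer below the legal deviation limit) so that plans that look balanced on noisy data are more likely to remain balanced on the true counts. All names, parameter values, and the noise model are illustrative assumptions.

```python
# Hypothetical illustration: additive disclosure-avoidance noise can push a
# district plan past a strict population-deviation threshold; selecting plans
# with a buffer below the limit mitigates the risk. Not the authors' method.
import numpy as np

rng = np.random.default_rng(0)

def discrete_laplace(scale, size, rng):
    """Symmetric two-sided geometric noise, a discrete analogue of Laplace
    noise commonly used for differentially private counts."""
    p = 1.0 - np.exp(-1.0 / scale)
    return rng.geometric(p, size) - rng.geometric(p, size)

# True block populations, grouped into 10 districts of 100 blocks each
# (purely synthetic numbers).
true_blocks = rng.poisson(lam=40, size=(10, 100))
noisy_blocks = true_blocks + discrete_laplace(scale=5.0, size=true_blocks.shape, rng=rng)

def max_deviation(district_pops):
    """Maximum relative deviation of district populations from the ideal."""
    ideal = district_pops.mean()
    return np.abs(district_pops - ideal).max() / ideal

true_dev = max_deviation(true_blocks.sum(axis=1))
noisy_dev = max_deviation(noisy_blocks.sum(axis=1))

LEGAL_LIMIT = 0.05  # illustrative deviation target, not a statement of law
BUFFER = 0.01       # hypothetical tightened selection criterion

print(f"deviation on true data:  {true_dev:.4f}")
print(f"deviation on noisy data: {noisy_dev:.4f}")
print("passes strict limit on noisy data:", noisy_dev <= LEGAL_LIMIT)
print("passes buffered criterion:        ", noisy_dev <= LEGAL_LIMIT - BUFFER)
```

The buffered criterion is one way to read the abstract's point that "tweaking selection criteria enables a redistricter to draw balanced plans": a plan chosen with some slack on the noisy data is less likely to violate the threshold on the official counts.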