Algorithm Transparency through the Fair Credit Reporting Act (FCRA)

Karl Schmeckpeper, Sonia F. Roberts, M. Ouellet, Matthew Malencia, Divya Jain, Walker Gosrich, Val Bromberg
{"title":"通过公平信用报告法案(FCRA)实现算法透明度","authors":"Karl Schmeckpeper, Sonia F. Roberts, M. Ouellet, Matthew Malencia, Divya Jain, Walker Gosrich, Val Bromberg","doi":"10.38126/jspg180415","DOIUrl":null,"url":null,"abstract":"Racial discrimination in housing has long fueled disparities in homeownership and wealth in the United States. Now, automated algorithms play a dominant role in rental and lending decisions. Advocates of these technologies argue that mortgage lending algorithms reduce discrimination. However, “errors in background check reports persist and remain pervasive,” and algorithms are at risk for inheriting prejudices from society and reflect pre-existing patterns of inequality. Additionally, algorithmic discrimination is often challenging to identify and difficult to explain or prosecute in court. While the Federal Trade Commission (FTC) is responsible for prosecuting this type of discrimination under the Fair Credit Reporting Act (FCRA), their enforcement regime “has inadequately regulated industry at the federal and state level and failed to provide consumers access to justice at an individual level,” as evidenced by its mere eighty-seven enforcement actions in the past forty years. In comparison, 4,531 lawsuits have been brought under the FCRA by other groups in 2018 alone. Therefore, the FTC must update its policies to ensure it can identify, prosecute, and facilitate third-party lawsuits against a primary driver of housing discrimination in the 21st century: discrimination within algorithmic decision making. We recommend that the FTC issue a rule requiring companies to publish a data plan with all consumer reporting products. Currently, the FTC recommends that companies make an internal assessment of the components of the proposed data plan to ensure that they are not in violation of the FCRA. Therefore, requiring that these plans be published publicly does not place undue burden on companies and empowers consumers to advocate for themselves and report unfair practices to the FTC. Coupled together, these will reduce the costs of investigation and enforcement by the FTC and decrease the discriminatory impact of automated decision systems on marginalized communities.","PeriodicalId":227854,"journal":{"name":"Intersectional Science Policy","volume":"1 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2021-09-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":"{\"title\":\"Algorithm Transparency through the Fair Credit Reporting Act (FCRA)\",\"authors\":\"Karl Schmeckpeper, Sonia F. Roberts, M. Ouellet, Matthew Malencia, Divya Jain, Walker Gosrich, Val Bromberg\",\"doi\":\"10.38126/jspg180415\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"Racial discrimination in housing has long fueled disparities in homeownership and wealth in the United States. Now, automated algorithms play a dominant role in rental and lending decisions. Advocates of these technologies argue that mortgage lending algorithms reduce discrimination. However, “errors in background check reports persist and remain pervasive,” and algorithms are at risk for inheriting prejudices from society and reflect pre-existing patterns of inequality. Additionally, algorithmic discrimination is often challenging to identify and difficult to explain or prosecute in court. 
While the Federal Trade Commission (FTC) is responsible for prosecuting this type of discrimination under the Fair Credit Reporting Act (FCRA), their enforcement regime “has inadequately regulated industry at the federal and state level and failed to provide consumers access to justice at an individual level,” as evidenced by its mere eighty-seven enforcement actions in the past forty years. In comparison, 4,531 lawsuits have been brought under the FCRA by other groups in 2018 alone. Therefore, the FTC must update its policies to ensure it can identify, prosecute, and facilitate third-party lawsuits against a primary driver of housing discrimination in the 21st century: discrimination within algorithmic decision making. We recommend that the FTC issue a rule requiring companies to publish a data plan with all consumer reporting products. Currently, the FTC recommends that companies make an internal assessment of the components of the proposed data plan to ensure that they are not in violation of the FCRA. Therefore, requiring that these plans be published publicly does not place undue burden on companies and empowers consumers to advocate for themselves and report unfair practices to the FTC. Coupled together, these will reduce the costs of investigation and enforcement by the FTC and decrease the discriminatory impact of automated decision systems on marginalized communities.\",\"PeriodicalId\":227854,\"journal\":{\"name\":\"Intersectional Science Policy\",\"volume\":\"1 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2021-09-27\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"0\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Intersectional Science Policy\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.38126/jspg180415\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Intersectional Science Policy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.38126/jspg180415","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Citations: 0

Abstract

Racial discrimination in housing has long fueled disparities in homeownership and wealth in the United States. Now, automated algorithms play a dominant role in rental and lending decisions. Advocates of these technologies argue that mortgage lending algorithms reduce discrimination. However, “errors in background check reports persist and remain pervasive,” and algorithms are at risk of inheriting prejudices from society and reflecting pre-existing patterns of inequality. Additionally, algorithmic discrimination is often challenging to identify and difficult to explain or prosecute in court. While the Federal Trade Commission (FTC) is responsible for prosecuting this type of discrimination under the Fair Credit Reporting Act (FCRA), its enforcement regime “has inadequately regulated industry at the federal and state level and failed to provide consumers access to justice at an individual level,” as evidenced by its mere eighty-seven enforcement actions in the past forty years. In comparison, 4,531 lawsuits were brought under the FCRA by other groups in 2018 alone. Therefore, the FTC must update its policies to ensure it can identify, prosecute, and facilitate third-party lawsuits against a primary driver of housing discrimination in the 21st century: discrimination within algorithmic decision making. We recommend that the FTC issue a rule requiring companies to publish a data plan with all consumer reporting products. Currently, the FTC recommends that companies make an internal assessment of the components of the proposed data plan to ensure that they are not in violation of the FCRA. Therefore, requiring that these plans be published does not place an undue burden on companies, and it empowers consumers to advocate for themselves and report unfair practices to the FTC. Together, these measures will reduce the FTC's costs of investigation and enforcement and decrease the discriminatory impact of automated decision systems on marginalized communities.
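To make the recommended disclosure concrete, below is a minimal sketch of how a consumer reporting company might publish a data plan as machine-readable metadata. The schema is a hypothetical illustration: the field names (data_sources, model_description, accuracy_audit, dispute_contact) and the example values are assumptions for this sketch, not components prescribed by the FCRA, the FTC, or the abstract.

```python
from dataclasses import dataclass, asdict
import json

@dataclass
class DataPlan:
    """Hypothetical machine-readable 'data plan' published alongside a
    consumer reporting product. All field names are illustrative; neither
    the FCRA nor the FTC currently prescribes a schema."""
    product_name: str
    data_sources: list[str]   # where the input data comes from
    model_description: str    # plain-language summary of the algorithm
    accuracy_audit: str       # how error rates are measured and disclosed
    dispute_contact: str      # where consumers can contest errors

plan = DataPlan(
    product_name="Tenant screening score (example)",
    data_sources=["county eviction records", "credit bureau tradelines"],
    model_description="Gradient-boosted trees estimating likelihood of lease default",
    accuracy_audit="Quarterly false-positive rates reported by geography",
    dispute_contact="disputes@example-bureau.invalid",
)

# Publishing the plan as JSON lets consumers and regulators inspect it directly.
print(json.dumps(asdict(plan), indent=2))
```

A structured, published format of this kind would also let third parties compare versions of a plan over time and flag undisclosed changes, which is consistent with the abstract's argument that public data plans lower the FTC's investigation and enforcement costs.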