Neutrality and Algorithms in Libraries

T. Lamanna
{"title":"Neutrality and Algorithms in Libraries","authors":"T. Lamanna","doi":"10.5860/JIFP.V3I2-3.6860","DOIUrl":null,"url":null,"abstract":"A recently published book, Safiya Noble’s 2018’s Algorithms of Oppression, has become an extremely popular read in our field as of late. While the book highlights some very important information about how our digital architecture de facto marginalizes people, it offers few remedies, other than expressing concerns about humans’ control of how the algorithm is built, thus influencing how it works. The book details how we must admit that our algorithms are human-generated, but does little to explain how this situation can be remedied beyond “fixing the algorithms.” Algorithms cannot be neutral, nor should they be; they are created by people and thus inherit the biases, conscious or unconscious, of their creators. No human has the capacity to be unbiased, so no algorithm can be. If they were, they could easily be gamed by malicious actors who would try to skew results. They need to be constantly worked and massaged to make sure they are behaving in a positive and progressive direction.","PeriodicalId":422726,"journal":{"name":"Journal of Intellectual Freedom & Privacy","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2019-01-15","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Intellectual Freedom & Privacy","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.5860/JIFP.V3I2-3.6860","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

Safiya Noble's 2018 book Algorithms of Oppression has become an extremely popular read in our field. While the book highlights important information about how our digital architecture de facto marginalizes people, it offers few remedies beyond expressing concern about humans' control over how algorithms are built and, consequently, how they work. The book details why we must admit that our algorithms are human-generated, but does little to explain how the situation can be remedied beyond "fixing the algorithms." Algorithms cannot be neutral, nor should they be; they are created by people and thus inherit the biases, conscious or unconscious, of their creators. No human has the capacity to be unbiased, so no algorithm can be. Even if an algorithm could be made neutral, it could easily be gamed by malicious actors seeking to skew its results. Algorithms need to be constantly worked and massaged to ensure they behave in a positive and progressive direction.