Mapping the social implications of platform algorithms for LGBTQ+ communities

David Myles, Stefanie Duguay, Lucia Flores Echaiz
{"title":"Mapping the social implications of platform algorithms for LGBTQ+ communities","authors":"David Myles, Stefanie Duguay, Lucia Flores Echaiz","doi":"10.33621/jdsr.v5i4.162","DOIUrl":null,"url":null,"abstract":"LGBTQ+ communities were among the first to appropriate the Internet to experiment with their identities and socialize outside of mainstream society. Recently, those platforms have implemented algorithmic systems that curate, exploit, and predict user practices and identities. Yet, the social implications that platform algorithms raise for LGBTQ+ communities remain largely unexplored. Drawing from critical platform studies, science and technology studies, as well as gender and sexuality studies, this paper maps the main issues that platform algorithms raise for LGBTQ+ users and analyzes their implications for social justice and equity. To do so, it identifies and discusses public controversies through a review and analysis of journalistic articles. Our analysis points to five important algorithmic issues that affect the lives of LGBTQ+ users in ways that require additional scrutiny from researchers, policymakers, and tech developers alike: the ability for sorting algorithms to identify, categorize, and predict the sexual orientation and/or gender identity of users; the role that recommendation algorithms play in mediating LGBTQ+ identities, kinship, and cultures; the development of automated anti-LGBTQ+ speech detection/filtering software and the collateral harm caused to LGBTQ+ users; the power struggles over the nature and types of visibility afforded to LGBTQ+ issues online; and the overall enactment of cisheteronormative biases by platform affordances.","PeriodicalId":199704,"journal":{"name":"Journal of Digital Social Research","volume":"30 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2023-09-08","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"0","resultStr":null,"platform":"Semanticscholar","paperid":null,"PeriodicalName":"Journal of Digital Social Research","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.33621/jdsr.v5i4.162","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
引用次数: 0

Abstract

LGBTQ+ communities were among the first to appropriate the Internet to experiment with their identities and socialize outside of mainstream society. Recently, online platforms have implemented algorithmic systems that curate, exploit, and predict user practices and identities. Yet, the social implications that platform algorithms raise for LGBTQ+ communities remain largely unexplored. Drawing from critical platform studies, science and technology studies, as well as gender and sexuality studies, this paper maps the main issues that platform algorithms raise for LGBTQ+ users and analyzes their implications for social justice and equity. To do so, it identifies and discusses public controversies through a review and analysis of journalistic articles. Our analysis points to five important algorithmic issues that affect the lives of LGBTQ+ users in ways that require additional scrutiny from researchers, policymakers, and tech developers alike: the ability of sorting algorithms to identify, categorize, and predict the sexual orientation and/or gender identity of users; the role that recommendation algorithms play in mediating LGBTQ+ identities, kinship, and cultures; the development of automated anti-LGBTQ+ speech detection and filtering software and the collateral harm it causes to LGBTQ+ users; the power struggles over the nature and types of visibility afforded to LGBTQ+ issues online; and the overall enactment of cisheteronormative biases by platform affordances.