Governing Image-Based Sexual Abuse: Digital Platform Policies, Tools, and Practices

N. Henry and Alice Witt

The Emerald International Handbook of Technology Facilitated Violence and Abuse, published 2021-06-04. DOI: 10.1108/978-1-83982-848-520211054
The nonconsensual taking or sharing of nude or sexual images, also known as “image-based sexual abuse,” is a major social and legal problem in the digital age. In this chapter, we examine the problem of image-based sexual abuse in the context of digital platform governance. Specifically, we focus on two key governance issues: first, the governance of platforms, including the regulatory frameworks that apply to technology companies; and second, the governance by platforms, focusing on their policies, tools, and practices for responding to image-based sexual abuse. After analyzing the policies and practices of a range of digital platforms, we identify four overarching shortcomings: (1) inconsistent, reductionist, and ambiguous language; (2) a stark gap between the policy and practice of content regulation, including transparency deficits; (3) imperfect technology for detecting abuse; and (4) the responsibilization of users to report and prevent abuse. Drawing on a model of corporate social responsibility (CSR), we argue that until platforms better address these problems, they risk failing victim-survivors of image-based sexual abuse and are implicated in the perpetration of such abuse. We conclude by calling for reasonable and proportionate state-based regulation that can help to better align governance by platforms with CSR initiatives.