{"title":"理解内容审核系统:在规模、时间和跨平台上理解互联网治理的新方法","authors":"Nicolas Suzor","doi":"10.4337/9781788977456.00013","DOIUrl":null,"url":null,"abstract":"There is increasing global concern about how the decisions of internet and telecommunications companies impact on human rights. As a key priority, if we care about how intermediaries govern their networks, we need to be able to measure their impact on human rights and work out how we can use this information to help protect them from external pressures that would limit our freedom and how we can hold them accountable for decisions they make on their own initiatives. Understanding the effects that technology companies have on our lives and identifying potential biases and other problems requires careful attention to the inputs and outputs of these systems and how they actually work in different social contexts. Analysis of this type will require large-scale access to data on individual decisions as well as deep qualitative analyses of the automated and human processes that platforms deploy internally. This chapter presents the ‘Platform Governance Observatory’: new research infrastructure that was designed to enable the systematic study of content moderation practices on major social media platforms. The core research question that made this infrastructure necessary was to understand, at a large scale, what social media content is moderated and by whom, how this compares between platforms, and how this changes over time. So far, this infrastructure enables the analysis of content removals of public posts on YouTube, Twitter, and Instagram. The infrastructure I created to support this exploration proceeds on a general design principle of building a random sample of public social media content, and then tests the availability of that sample at a later date.","PeriodicalId":145445,"journal":{"name":"Computational Legal Studies","volume":"34 1","pages":"0"},"PeriodicalIF":0.0000,"publicationDate":"2020-09-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":"5","resultStr":"{\"title\":\"Understanding content moderation systems: new methods to understand internet governance at scale, over time, and across platforms\",\"authors\":\"Nicolas Suzor\",\"doi\":\"10.4337/9781788977456.00013\",\"DOIUrl\":null,\"url\":null,\"abstract\":\"There is increasing global concern about how the decisions of internet and telecommunications companies impact on human rights. As a key priority, if we care about how intermediaries govern their networks, we need to be able to measure their impact on human rights and work out how we can use this information to help protect them from external pressures that would limit our freedom and how we can hold them accountable for decisions they make on their own initiatives. Understanding the effects that technology companies have on our lives and identifying potential biases and other problems requires careful attention to the inputs and outputs of these systems and how they actually work in different social contexts. Analysis of this type will require large-scale access to data on individual decisions as well as deep qualitative analyses of the automated and human processes that platforms deploy internally. This chapter presents the ‘Platform Governance Observatory’: new research infrastructure that was designed to enable the systematic study of content moderation practices on major social media platforms. 
The core research question that made this infrastructure necessary was to understand, at a large scale, what social media content is moderated and by whom, how this compares between platforms, and how this changes over time. So far, this infrastructure enables the analysis of content removals of public posts on YouTube, Twitter, and Instagram. The infrastructure I created to support this exploration proceeds on a general design principle of building a random sample of public social media content, and then tests the availability of that sample at a later date.\",\"PeriodicalId\":145445,\"journal\":{\"name\":\"Computational Legal Studies\",\"volume\":\"34 1\",\"pages\":\"0\"},\"PeriodicalIF\":0.0000,\"publicationDate\":\"2020-09-17\",\"publicationTypes\":\"Journal Article\",\"fieldsOfStudy\":null,\"isOpenAccess\":false,\"openAccessPdf\":\"\",\"citationCount\":\"5\",\"resultStr\":null,\"platform\":\"Semanticscholar\",\"paperid\":null,\"PeriodicalName\":\"Computational Legal Studies\",\"FirstCategoryId\":\"1085\",\"ListUrlMain\":\"https://doi.org/10.4337/9781788977456.00013\",\"RegionNum\":0,\"RegionCategory\":null,\"ArticlePicture\":[],\"TitleCN\":null,\"AbstractTextCN\":null,\"PMCID\":null,\"EPubDate\":\"\",\"PubModel\":\"\",\"JCR\":\"\",\"JCRName\":\"\",\"Score\":null,\"Total\":0}","platform":"Semanticscholar","paperid":null,"PeriodicalName":"Computational Legal Studies","FirstCategoryId":"1085","ListUrlMain":"https://doi.org/10.4337/9781788977456.00013","RegionNum":0,"RegionCategory":null,"ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":null,"EPubDate":"","PubModel":"","JCR":"","JCRName":"","Score":null,"Total":0}
Understanding content moderation systems: new methods to understand internet governance at scale, over time, and across platforms

Nicolas Suzor. Computational Legal Studies, 17 September 2020. DOI: 10.4337/9781788977456.00013
There is increasing global concern about how the decisions of internet and telecommunications companies impact human rights. If we care about how intermediaries govern their networks, a key priority is to be able to measure their impact on human rights, and to work out how we can use this information both to help protect intermediaries from external pressures that would limit our freedom and to hold them accountable for decisions they make on their own initiative. Understanding the effects that technology companies have on our lives, and identifying potential biases and other problems, requires careful attention to the inputs and outputs of these systems and to how they actually work in different social contexts. Analysis of this type requires large-scale access to data on individual decisions, as well as deep qualitative analysis of the automated and human processes that platforms deploy internally. This chapter presents the ‘Platform Governance Observatory’: new research infrastructure designed to enable the systematic study of content moderation practices on major social media platforms. The core research question that made this infrastructure necessary was to understand, at a large scale, what social media content is moderated and by whom, how this compares across platforms, and how it changes over time. So far, the infrastructure supports analysis of removals of public posts on YouTube, Twitter, and Instagram. It proceeds from a general design principle: build a random sample of public social media content, then test the availability of that sample at a later date.
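The chapter does not reproduce the Observatory's code, but the sample-then-recheck design principle can be sketched roughly as follows. In this hypothetical Python sketch, the function names, the CSV layout, and the plain HTTP availability check are all illustrative assumptions rather than the Observatory's actual pipeline, which would need platform-specific API access, rate limiting, and much larger scale.

```python
# Minimal sketch of the sample-then-recheck design described above.
# All names and the HTTP-based availability check are assumptions for
# illustration; this is not the chapter's actual implementation.
import csv
import datetime
import requests


def record_sample(post_urls, path="sample.csv"):
    """Store a random sample of public post URLs with a capture timestamp."""
    with open(path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(["url", "sampled_at"])
        now = datetime.datetime.now(datetime.timezone.utc).isoformat()
        for url in post_urls:
            writer.writerow([url, now])


def recheck_sample(path="sample.csv"):
    """Revisit each sampled URL later and record whether it still resolves.

    A non-200 response is treated only as a *candidate* removal; further
    follow-up is needed to classify why the content became unavailable.
    """
    results = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                resp = requests.head(row["url"], allow_redirects=True, timeout=10)
                available = resp.status_code == 200
            except requests.RequestException:
                available = None  # network error: availability unknown
            results.append({**row, "available": available})
    return results
```

Note that an unavailable URL does not, by itself, reveal who removed the content: answering the chapter's ‘by whom’ question requires distinguishing platform takedowns from user deletions, which depends on additional signals such as the specific error page or API response each platform returns.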