{"title":"Governing invisibility in the platform economy: excavating the logics of platform care","authors":"Vicky Kluzik","doi":"10.14763/2022.1.1636","DOIUrl":"https://doi.org/10.14763/2022.1.1636","url":null,"abstract":"","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"12 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117097536","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Visibility layers: a framework for systematising the gender gap in Wikipedia content","authors":"Pablo Beytía, Claudia Wagner","doi":"10.14763/2022.1.1623","DOIUrl":"https://doi.org/10.14763/2022.1.1623","url":null,"abstract":"","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"1 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"130679124","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"\"Doing gender\" by sharing: examining the gender gap in the European sharing economy","authors":"T. Eichhorn, C. Hoffmann, Katharina Heger","doi":"10.14763/2022.1.1627","DOIUrl":"https://doi.org/10.14763/2022.1.1627","url":null,"abstract":"","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"2 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"117332671","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Assessing gender inequality in digital labour platforms in Europe","authors":"Paula Rodríguez-Modroño, Annarosa Pesole, Purificación López-Igual","doi":"10.14763/2022.1.1622","DOIUrl":"https://doi.org/10.14763/2022.1.1622","url":null,"abstract":"","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"28 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121573208","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Addressing gendered affordances of the platform economy: the case of UpWork","authors":"Elisabetta Stringhi","doi":"10.14763/2022.1.1634","DOIUrl":"https://doi.org/10.14763/2022.1.1634","url":null,"abstract":": This study investigates UpWork affordances and their implications for female freelancers experiencing different forms of cyberviolence. Building up on a theoretical framework to situate the concept of affordances, gendered affordances and cyberviolence within a platform economy context, I use UpWork as a relevant case study to assess how online platforms that intermediate labour transactions present gendered affordances contributing to cyberviolence against women. I analysed the discussions of female users and freelancers in UpWork in line with the digital methods approach, by conducting a qualitative digital ethnographic analysis. These discussions serve as a foundation for a subsequent critical analysis of UpWork terms of service, to gain a wider understanding of how the digital platform controls information flows and models interactions between different categories of users. The findings suggest that UpWork affordances are gendered affordances, as they allow male users different conducts, as opposed to female freelancers, entrepreneurs, or users. I conclude that, while UpWork core features are allegedly neutral, they enable gendered affordances widening the gender gap in digital market transactions by facilitating the occurrence of cyber violence against women.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"8 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-03-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"134431280","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Artificial emotional intelligence beyond East and West","authors":"Daniel White, H. Katsuno","doi":"10.14763/2022.1.1618","DOIUrl":"https://doi.org/10.14763/2022.1.1618","url":null,"abstract":"Artificial emotional intelligence refers to technologies that perform, recognise, or record affective states. More than merely a technological function, however, it is also a social process whereby cultural assumptions about what emotions are and how they are made are translated into composites of code, software, and mechanical platforms that operationalise certain models of emotion over others. This essay illustrates how aspects of cultural difference are both incorporated and elided in projects that equip machines with emotional intelligence. It does so by comparing the field of affective computing, which emerged in the North-Atlantic in the 1990s, with kansei (affective) engineering, which developed in Japan in the 1980s. It then leverages this comparison to argue for more diverse applications of the culture concept in both the development and critique of systems with artificial emotional intelligence. Issue 1 This article belongs to Concepts of the digital society, a special section of Internet Policy Review guest-edited by Christian Katzenbach and Thomas Christian Bächle.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"19 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-02-11","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126350017","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Mixed traditions: evaluating telecommunications transparency","authors":"Ben Ballard, C. Parsons","doi":"10.14763/2022.1.1613","DOIUrl":"https://doi.org/10.14763/2022.1.1613","url":null,"abstract":": This article draws upon academic and civil society literatures to create a framework for assessing the effectiveness of telecommunications transparency reports on government requests for information within Canada, the United Kingdom, and the United States. Our analysis suggests that effective reports are targeted, in that they embody both verifiable and performative approaches to transparency, and also are sustainable, insofar as they evolve in their scope and structure while remaining regularly published. Emergent from this evaluation, we can better explain why different companies, in different jurisdictions, demonstrate variation in their adoption of effective transparency reporting practices over the last decade.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"32 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2022-01-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132587520","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Critical questions for Facebook's virtual reality: data, power and the metaverse","authors":"Ben Egliston, M. Carter","doi":"10.14763/2021.4.1610","DOIUrl":"https://doi.org/10.14763/2021.4.1610","url":null,"abstract":": Virtual Reality (VR) represents an emerging class of spatial computing technology reliant upon the capture and processing of data about the user (such as their body and its interface with the hardware), or their surrounding environment. Much like digital media more generally, there are growing concerns of who stands to benefit from VR as a data-intensive form of technology, and where its potential data-borne harms may lie. Drawing from critical data studies, we examine the case of Facebook’s Oculus VR—a market leading VR technology, central to their metaverse ambitions. Through this case, we argue that VR as a data-intensive device is not one of unalloyed benefit, but one fraught with power inequity—one that has the potential to exacerbate wealth inequity, institute algorithmic bias, and bring about new forms of digital exclusion. We contend that policy to date has had limited engagement with VR, and that regulatory intervention will be needed as VR becomes more widely adopted in society.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"25 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-20","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"127996031","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Programming the machine: gender, race, sexuality, AI, and the construction of credibility and deceit a t the border","authors":"Lucy Hall","doi":"10.14763/2021.4.1601","DOIUrl":"https://doi.org/10.14763/2021.4.1601","url":null,"abstract":"There is increasing recognition of the significance of the political, social, economic, and strategic effects of artificial intelligence (AI). This raises important ethical questions regarding the programming, use, and regulation of AI. This paper argues that both the programming and application of AI are inherently (cis)gendered, sexualised and racialised. AI is, after all, programmed by humans and the issue of who trains AI, teaches it to learn, and the ethics of doing so are therefore critical to avoiding the reproduction of (cis)gendered and racist stereotypes. The paper’s empirical focus is the EU-funded project iBorderCtrl, designed to manage security risks and enhance the speed of border crossings for third country nationals via the implementation of several AI-based technologies, including facial recognition and deception detection. By drawing together literature from 1) risk and security 2) AI and ethics/migration/asylum and 3) race, gender, (in)security, and AI, this paper explores the implications of lie detection for both regular border crossings and refugee protection with a conceptual focus on the intersections of gender, sexuality, and race. We argue here that AI border technologies such as iBorderCtrl pose a significant risk of both further marginalising and discriminating against LGBT persons, persons of colour, and asylum seekers and reinforcing existing non entree practices and policies. Issue 4 This paper is part of Feminist data protection, a special issue of Internet Policy Review guest-edited by Jens T. Theilen, Andreas Baur, Felix Bieker, Regina Ammicht Quinn, Marit Hansen, and Gloria González Fuster.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"33 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-12-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"129985620","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}