{"title":"Introducing the glossary of decentralised technosocial systems","authors":"V. Ferrari","doi":"10.14763/2021.2.1546","DOIUrl":"https://doi.org/10.14763/2021.2.1546","url":null,"abstract":"Interdisciplinary glossary on peer-to-peer, user-centric and privacy-enhancing decentralised technologies Issue 2 FIRST GLOSSARY ENTRIES IN THIS SECTION arrow_downward EDITORIAL: Introducing the glossary of decentralised technosocial systems Valeria Ferrari, University of Amsterdam Reputation Primavera De Filippi, CNRS Ori Shimony, dOrg Antonio Tenorio-Fornés, Universidad Complutense de Madrid Digital scarcity Jaya Klara Brekke, Durham University Aron Fischer, Colony Smart contracts Primavera De Filippi, CNRS Chris Wray, Legal Graph Company Limited Giovanni Sileno, University of Amsterdam Self-sovereign identity Alexandra Giannopoulou, University of Amsterdam Fennie Wang, Dionysus Labs Mining Wassim Zuhair Alsindi, Massachusetts Institute of Technology Laura Lotti, Independent researcher Blockchain-based technologies María-Cruz Valiente, Universidad Complutense de Madrid Florian Tschorsch, Technical University Berlin Cryptoeconomics Jaya Klara Brekke, Durham University Wassim Zuhair Alsindi, Massachusetts Institute of Technology Blockchain governance Aron Fischer, Colony María-Cruz Valiente, Universidad Complutense de Madrid Trust in blockchain-based systems Moritz Becker, Weizenbaum Institute for the Networked Society Balázs Bodó, University of Amsterdam Decentralized Autonomous Organization Samer Hassan, Universidad Complutense de Madrid Primavera De Filippi, CNRS Decentralisation in the blockchain space Balázs Bodó, University of Amsterdam Jaya Klara Brekke, Durham University Jaap-Henk Hoepman, Radboud University Cryptocurrency 2 Internet Policy Review 10(2) | 2021","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"59 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-04-19","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115052473","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Civil legal personality of artificial intelligence. Future or utopia?","authors":"Karolina Ziemianin","doi":"10.14763/2021.2.1544","DOIUrl":"https://doi.org/10.14763/2021.2.1544","url":null,"abstract":": The technology associated with artificial intelligence is developing rapidly. As a consequence, artificial intelligence is being applied in many spheres of life and increasingly affects the functioning of society. Actions of artificial intelligence may cause harm (e.g. in the case of autonomous vehicles that cause traffic accidents). Rules of civil law, especially those relating to liability for damage resulting from somebody’s fault or risk, came into being before artificial intelligence's invention and mostly before the latter’s significant recent development. They include the Polish Code of civil procedure, which addresses the issues associated with liability, adopted in 1964 and still in force today, although with certain amendments. Therefore, no provisions that would directly refer to artificial intelligence and legal consequences of its actions have been introduced into the Polish civil law. The same applies to European law. Therefore, the issue of whether existing regulations may be applied in the case of artificial intelligence or, perhaps, whether they should be appropriately adjusted, needs to be analysed. The starting point for this analysis is the possibility of conferring upon artificial intelligence the status of an entity under the law, allowing it to independently bear the liability for the damage caused by it. This","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"34 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2021-04-07","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"123271868","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Platform developmentalism: leveraging platform innovation for national development in Latin America","authors":"K. Reilly","doi":"10.14763/2020.4.1538","DOIUrl":"https://doi.org/10.14763/2020.4.1538","url":null,"abstract":"","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"5 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"115240294","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A non-discrimination principle for rankings in app stores","authors":"Dennis Jeroen Brouwer","doi":"10.14763/2020.4.1539","DOIUrl":"https://doi.org/10.14763/2020.4.1539","url":null,"abstract":"The gatekeeper position of app store operators gives them the power to favour their own and the most popular applications in the rankings of search results. Based on parallels with the non-discrimination principle in the European Union's Regulation on Open Internet Access, this article formulates a list of permitted and forbidden ranking rationales for app store operators. Permitted ranking rationales include text relevance, price, quality and the legality of content. These rationales contrast with the forbidden ranking rationales, such as those based on self-favouring without objective justification and the popularity of applications, which potentially limit consumer choice and distort the digital level playing field. Issue 4","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"16 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-22","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121115408","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Privacy self-management and the issue of privacy externalities: of thwarted expectations, and harmful exploitation","authors":"S. Brouwer","doi":"10.14763/2020.4.1537","DOIUrl":"https://doi.org/10.14763/2020.4.1537","url":null,"abstract":"This article argues that the self-management of one's privacy is impossible due to privacy externalities. Privacy externalities are the negative by-product of the services offered by some data controllers, whereby the price to \"pay\" for a service includes not just the provision of the user's own personal data, but also that of others. This term, related to similar concepts from the literature on privacy such as \"networked privacy\" or \"data pollutio\", is used here to bring to light the incentives and exploitative dynamics behind a phenomenon which, I demonstrate, benefits both the user and the data controller to the detriment of third-party data subjects. Building on these novel elements and on the relevant concepts and examples found in the existing literature, this article draws a comprehensive picture of the phenomenon, and offers two promising paths to address it-better enforcing the principle of data protection by design and by default, and relying on the framework of joint controllership.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-21","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"126376459","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Towards platform observability","authors":"Bernhard Rieder, J. Hofmann","doi":"10.14763/2020.4.1535","DOIUrl":"https://doi.org/10.14763/2020.4.1535","url":null,"abstract":"The growing power of digital platforms raises the question of democratic control or at least containment. In light of the transforming impact of platforms on markets, the public sphere, elections, and employment conditions, governments, and civil society alike are demanding more transparency and accountability. Shedding light on the principles and practices of algorithmic ordering promises to limit the power of platforms by subjecting their hidden operations to regulatory inspection. This article questions the popular image of an openable ‘black box’. Based on a critical reflection on transparency as a panacea for curtailing platform power, we propose the concept of observability to deal more systematically with the problem of studying complex algorithmic systems. We set out three broad principles as regulatory guidelines for making platforms more accountable. These principles concern the normative and analytical scope, the empirical and temporal dimension, and the necessary capacities for learning and knowledge generation.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"365 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"132551436","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Algorithmic bias and the Value Sensitive Design approach","authors":"Judith Simon, Pak-Hang Wong, Gernot Rieder","doi":"10.14763/2020.4.1534","DOIUrl":"https://doi.org/10.14763/2020.4.1534","url":null,"abstract":"Recently, amid growing awareness that computer algorithms are not neutral tools but can cause harm by reproducing and amplifying bias, attempts to detect and prevent such biases have intensified. An approach that has received considerable attention in this regard is the Value Sensitive Design (VSD) methodology, which aims to contribute to both the critical analysis of (dis)values in existing technologies and the construction of novel technologies that account for specific desired values. This article provides a brief overview of the key features of the Value Sensitive Design approach, examines its contributions to understanding and addressing issues around bias in computer systems, outlines the current debates on algorithmic bias and fairness in machine learning, and discusses how such debates could profit from VSD-derived insights and recommendations. Relating these debates on values in design and algorithmic bias to research on cognitive biases, we conclude by stressing our collective duty to not only detect and counter biases in software systems, but to also address and remedy their societal origins. Issue 4 This article belongs to Concepts of the digital society, a special section of Internet Policy Review guest-edited by Christian Katzenbach and Thomas Christian Bächle.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"10 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-18","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"124316306","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Cybersecurity","authors":"Michael Veale, I. Brown","doi":"10.14763/2020.4.1533","DOIUrl":"https://doi.org/10.14763/2020.4.1533","url":null,"abstract":"Cite as Michael Veale and Ian Brown (2020) 'Cybersecurity' 9(4) Internet Policy Review Cybersecurity covers the broad range of technical and social issues that must be considered to protect networked information systems. The importance of the concept has increased as so many government, business, and day-to-day activities globally have moved online. It has been increasingly referred to in both academic and mainstream publications since 2003, in fields including software engineering, international relations, crisis management and public safety, slowly overtaking more technical terms such as computer/system/data security (popular in the 1970s/1980s) and information security (popular from the mid 1990s). But its strong association with national security and defence agencies, and disconnection from social science notions such as place, have led to concerns of inappropriate cyber securitisation of government programmes.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"13 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"121928878","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Digital sovereignty","authors":"Julia Pohle, Thorsten Thiel","doi":"10.14763/2020.4.1532","DOIUrl":"https://doi.org/10.14763/2020.4.1532","url":null,"abstract":": Over the last decade, digital sovereignty has become a central element in policy discourses on digital issues. Although it has become popular in both centralised/authoritarian and democratic countries alike, the concept remains highly contested. After investigating the challenges to sovereignty apparently posed by the digital transformation, this essay retraces how sovereignty has re-emerged as a key category with regard to the digital. By systematising the various normative claims to digital sovereignty, it then goes on to show how, today, the concept is understood more as a discursive practice in politics and policy than as a legal or organisational concept.","PeriodicalId":219999,"journal":{"name":"Internet Policy Rev.","volume":"69 1","pages":"0"},"PeriodicalIF":0.0,"publicationDate":"2020-12-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"128950293","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}