{"title":"Ensuring Privacy and Confidentiality in Social Work Through Intentional Omissions of Information in Client Information Systems: a Qualitative Study of Available and Non-available Data.","authors":"Diana Schneider","doi":"10.1007/s44206-022-00029-9","DOIUrl":"https://doi.org/10.1007/s44206-022-00029-9","url":null,"abstract":"<p><p>There is intensive debate about the use of AI-based systems in social work, although the degree of digitalization is low in many jurisdictions. In this article, the findings of research about the process of case recording in client information systems for social work are presented. Between January and June 2020, 20 guideline-based interviews were conducted with experts working for funding agencies or service providers. A significant finding was that there are intentional omissions of information within digital client records, despite being relevant for further case processing. This finding and the reasons for omission are highly relevant to the current debate on AI, because it extends consideration beyond the ethics of design to include the ethics of usage.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 3","pages":"26"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9676750/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10762167","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"AI Ethics, Ethics Washing, and the Need to Politicize Data Ethics.","authors":"Gijs van Maanen","doi":"10.1007/s44206-022-00013-3","DOIUrl":"https://doi.org/10.1007/s44206-022-00013-3","url":null,"abstract":"<p><p>Many commercial actors in the tech sector publish ethics guidelines as a means to 'wash away' concerns raised about their policies. For some academics, this phenomenon is reason to replace ethics with other tools and methods in an attempt to make sure that the tech sector does not cross any moral Rubicons. Others warn against the tendency to reduce a criticism of 'ethics washing' into one of ethics simpliciter. In this essay, I argue firstly that the dominant focus on principles, dilemmas, and theory in conventional ethical theories and practices could be an explanation of it lacking resistance to abuse by dominant actors, and hence its rather disappointing capacity to stop, redirect, or at least slow down big tech's course. Secondly, drawing from research on casuistry and political philosopher Raymond Geuss, this essay will make a case for a question, rather than theory or principle-based ethical data practice. The emphasis of this approach is placed on the acquisition of a thorough understanding of a social-political phenomenon like tech development. This approach should be replenished with one extra component to the picture of the repoliticized data ethics drawn so far: the importance of 'exemplars,' or stories. Precisely the fact that one should acquire an in-depth understanding of the problem in practice will also allow one to look in the past, present, or future for similar and comparable stories from which one can learn.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":" ","pages":"9"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9373109/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"40700492","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Value-centered Exploration of Data Privacy and Personalized Privacy Assistants.","authors":"Sarah E Carter","doi":"10.1007/s44206-022-00028-w","DOIUrl":"https://doi.org/10.1007/s44206-022-00028-w","url":null,"abstract":"<p><p>In the current post-GDPR landscape, privacy notices have become ever more prevalent on our phones and online. However, these notices are not well suited to their purpose of helping users make informed decisions. I suggest that instead of utilizing notice to elicit informed consent, we could repurpose privacy notices to create the space for more meaningful, value-centered user decisions. Value-centered privacy decisions, or those that accurately reflect who we are and what we value, encapsulate the intuitive role of personal values in data privacy decisions. To explore how we could design for such decisions, I utilize Suzy Killmister's Four-Dimensional Theory of Autonomy (4DT) to operationalize value-centered privacy decisions. I then utilize 4DT to help design a system-called a value-centered privacy assistant (VcPA)-that could help create the space for value-centered data privacy decisions using privacy notices. Using this 4DT lens, I further assess the degree that an existing technology, personalized privacy assistants (PPAs), use notices in a manner that allows for value-centered decision-making. I lastly utilize insights from the PPA assessment to inform the design of a VcPA, concluding that a VcPA could utilize notices to assist users in value-centered app selection and in other data privacy decisions.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 3","pages":"27"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9700617/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10418443","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Deepfakes and Democracy (Theory): How Synthetic Audio-Visual Media for Disinformation and Hate Speech Threaten Core Democratic Functions.","authors":"Maria Pawelec","doi":"10.1007/s44206-022-00010-6","DOIUrl":"https://doi.org/10.1007/s44206-022-00010-6","url":null,"abstract":"<p><p>Observers fear that deepfakes will shake the very foundations of democracy. Notwithstanding, in-depth scholarly analyses of deepfakes' political impact are rare, and do not consider theories of democracy. This contribution helps close this research gap, drawing on Warren's problem-oriented democracy theory, as well as theories of deliberative democracy and contributions on the role of trust in democracies. I identify three core functions of democratic systems and their normative foundations, namely empowered inclusion, collective agenda and will formation (supported by deliberation), and collective decision-making. Based on a literature and media analysis, I systematize different types of deepfakes serving either disinformation or hate speech and outline how they weaken core democratic functions and norms: Deepfakes impede citizens' empowered inclusion in debates and decisions that affect them, e.g. by hampering efforts to hold political representatives accountable or further marginalizing certain societal groups such as women or ethnic minorities. Deepfakes also undermine collective agenda and will formation by threatening the epistemic quality of deliberation as well as citizens' mutual empathy and respect. This culminates in a decreased legitimacy of collective decisions taken, which is additionally threatened by pervasive (but mostly speculative) fears of deepfake election manipulation. My analysis has implications for (future) governance efforts addressing deepfakes. Such efforts are increasing, e.g. on the part of social media platforms, but also (supra-)national regulatory bodies.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 2","pages":"19"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9453721/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"33463726","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Own the Unknown: An Anticipatory Approach to Prepare Society for the Quantum Age","authors":"Eline de Jong","doi":"10.1007/s44206-022-00020-4","DOIUrl":"https://doi.org/10.1007/s44206-022-00020-4","url":null,"abstract":"","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"109 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"85693392","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Students, Participatory Design, and Serious Games in a Response to: 'No Algorithmization without Representation: Pilot Study on Regulatory Experiments in an Exploratory Sandbox'.","authors":"Brian Ballsun-Stanton","doi":"10.1007/s44206-022-00024-0","DOIUrl":"https://doi.org/10.1007/s44206-022-00024-0","url":null,"abstract":"<p><p>No Algorithmization without Representation tracked a cohort of 'Lithopy' crypto-government sandbox participants in a longitudinal study looking at COVID-19 contact tracing app acceptance. These survey responses extended experiences with theoretical blockchain town governance by also tracking reasons for and against compliance with contact tracing apps. They found that the expressed opinions of students were incoherent and demanded technical or policy responses outside of the students' direct experiences. In this response to that paper, I leverage the paper's (commendable) open data to suggest that the sandbox's claims of 'No Algorithmization without Representation' is a rediscovery of participatory design within the context of the serious games movement. While Role-Playing Games and War Games are excellent pedagogic tools xor planning tools-using undergraduate students' participation in them as the basis of a claim for increased representation in technology policy is a bold claim. This claim is not fully substantiated by the paper's data. Nevertheless, there is a need for better decision-making and public representation within technology design and policy-making spaces-making the claim for serious games as a meaningful public policy contribution not without merit.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 3","pages":"23"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9593969/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"10778552","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Leveraging Blockchain-Based Archival Solutions for Sensitive Documentation: a Xinjiang Case Study.","authors":"Remy Hellstern, Daniel C Park, Victoria Lemieux, Guldana Salimjan","doi":"10.1007/s44206-022-00007-1","DOIUrl":"https://doi.org/10.1007/s44206-022-00007-1","url":null,"abstract":"<p><p>This exploratory research surveys scholarly literature on decentralized storage solutions, including theories and works of archival science, and similar applications in humanitarian contexts, to illustrate the necessity of these systems in Xinjiang Uyghur Autonomous Region in China. Xinjiang has recently shifted into the spotlight of the international press for allegations of abuse and forced labor, coercive cultural assimilation, and the creation of a police state. The leadership of the People's Republic of China (PRC) justifies the existence of these training facilities and expansive surveillance networks as part of the PRC-backed efforts to de-radicalize ethnic groups in the region. However, many governments and scholars rebuke these justifications, arguing that these centers are state-run facilities that house extrajudicially detained individuals based on their ethnic identity and religious belief. This paper aims at limiting the plausible deniability of violations conducive to cultural genocide, thus improving the prospects for deterrence and accountability through decentralized evidence management. The technological sophistication of the regime in Xinjiang is outpacing centralized systems and rendering storage solutions hosting evidence of these violations obsolete. This jeopardizes the prospect of truth and reconciliation in the future and allows the party to craft and disseminate their narrative globally with little resistance. Major findings focus on how decentralized systems can improve the streamlining and hosting of evidence regarding human rights violations occurring as well as advancing the study of cryptographic management of evidence regarding the treatment of vulnerable communities in low-rights regions.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 1","pages":"4"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9289082/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"33478442","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"No Algorithmization Without Representation: Pilot Study on Regulatory Experiments in an Exploratory Sandbox.","authors":"Denisa Reshef Kera, František Kalvas","doi":"10.1007/s44206-022-00002-6","DOIUrl":"10.1007/s44206-022-00002-6","url":null,"abstract":"<p><p>The exploratory sandbox for blockchain services, Lithopy, provided an experimental alternative to the aspirational frameworks and guidelines regulating algorithmic services ex post or ex ante. To understand the possibilities and limits of this experimental approach, we compared the regulatory expectations in the sandbox with the real-life decisions about an \"actual\" intrusive service: contact tracing application. We gathered feedback on hypothetical and real intrusive services from a group of 59 participants before and during the first and second waves of the COVID-19 pandemic in the Czech Republic (January, June 2020, and April 2021). Participants expressed support for interventions based on an independent rather than government oversight that increases participation and representation. Instead of reducing the regulations to code or insisting on strong regulations over the code, participants demanded hybrid combinations of code and regulations. We discuss this as a demand for \"no algorithmization without representation.\" The intrusive services act as new algorithmic \"territories,\" where the \"data\" settlers must redefine their sovereignty and agency on new grounds. They refuse to rely upon the existing institutions and promises of governance by design and seek tools that enable engagement in the full cycle of the design, implementation, and evaluation of the services. The sandboxes provide an environment that bridges the democratic deficit in the design of algorithmic services and their regulations.</p>","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"1 2","pages":"8"},"PeriodicalIF":0.0,"publicationDate":"2022-01-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.ncbi.nlm.nih.gov/pmc/articles/PMC9341158/pdf/","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"33511327","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Achieving a Data-Driven Risk Assessment Methodology for Ethical AI","authors":"Anna Felländer, J. Rebane, S. Larsson, M. Wiggberg, F. Heintz","doi":"10.1007/s44206-022-00016-0","DOIUrl":"https://doi.org/10.1007/s44206-022-00016-0","url":null,"abstract":"","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"30 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2021-11-29","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"89484882","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"A Pedagogy of Ethical Interface Production Based on Virtue Ethics","authors":"J. Gallagher","doi":"10.4324/9780429266140-5","DOIUrl":"https://doi.org/10.4324/9780429266140-5","url":null,"abstract":"","PeriodicalId":72819,"journal":{"name":"Digital society : ethics, socio-legal and governance of digital technology","volume":"51 1","pages":""},"PeriodicalIF":0.0,"publicationDate":"2019-08-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"80801704","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":0,"RegionCategory":"","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}