{"title":"The protection of vulnerable algorithmic groups through collective data protection in the onlife world: A Brazilian perspective","authors":"Diego Machado","doi":"10.1016/j.clsr.2024.106027","DOIUrl":"10.1016/j.clsr.2024.106027","url":null,"abstract":"<div><p>The aim of this doctrinal legal study is to analyze the interplay between the vulnerability of groups in algorithmic systems and the protection of collective interests in data protection law in Brazil's legal system. Two research questions are raised: (i) Is the protection of personal data regulation applicable to data processing activities related to algorithmic groups? and (ii) can algorithmic groups be regarded as groups with vulnerability under the LGPD legal regime? This article is divided into three parts apart from the introduction, and combines three strands of research, namely group rights theory, vulnerability studies, and law and technology perspective. This combination is key to outline, in Sections 2 and 3, a theoretical framework that elucidates the concepts of collective data protection and group vulnerability mapping both onto the notion of algorithmic groups. Section 2 argues for the collective dimension of the right to the protection of personal data as the foundation of a collective data protection. Section 3, in turn, explores the conceptualization of group vulnerability and how this discourse resonates with algorithmic groups in the onlife world. I draw on vulnerability studies, and on Mireille Hildebrandt's law and technology perspective to delineate what do I mean by group vulnerability and how do I articulate theoretically this notion with algorithmic groups and the affordances of algorithmic systems. Section 4 examines the relation between collective data protection and vulnerability of algorithmic groups under the data protection legal framework in Brazil. 
To answer the research questions, the analysis is concentrated on three aspects of Brazilian data protection law: (i) the “collectivization of data protection”; (ii) the integration of group vulnerability in the data protection legal framework; (iii) data protection impact assessments in the context of LGPD's risk-based approach. The collective dimension of the right to personal data protection is increasingly recognized in Brazilian law through class-action litigation, particularly in the context of addressing vulnerabilities caused by new data-driven technologies. This collective dimension should guide courts and the Brazilian DPA in interpreting and applying the LGPD, especially Art. 12, § 2, regarding group data processing by algorithmic profiling systems. Data protection law in Brazil acknowledges that groups of data subjects may face vulnerability, requiring special protection and safeguards to mitigate risks and violations. Group vulnerability signals contexts deserving special attention and serves as a source of obligations and rights. Within LGPD's risk-based approach, mandatory DPIAs in ML-based algorithmic profiling systems help identify vulnerable groups and implement appropriate safeguards to mitigate risks of harm or rights violations. Non-compliance with safeguard implementation obligations should be considered a breach of Brazilian data protecti","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106027"},"PeriodicalIF":3.3,"publicationDate":"2024-08-02","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141961484","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The many features which make the eIDAS 2 Digital Wallet either risky or the ideal vehicle for the transition to post-quantum encryption","authors":"Giovanni Comandè , Margaret Varilek","doi":"10.1016/j.clsr.2024.106022","DOIUrl":"10.1016/j.clsr.2024.106022","url":null,"abstract":"<div><p>The amended Digital Identity Framework Regulation (“eIDAS 2″) is expected to be implemented by 2026, including its new solution of the Digital Identity Wallet from each Member State for its residents, citizens, and businesses. Widely used public key cryptosystems including those in the current EUDI Wallet prototypes are using electronic signatures and authentication that will need to be replaced by post-quantum resistant cryptography (PQC). In April 2024, the EU recommended general action by the Member States to prepare for quantum capability. We suggest that the European Digital Identity Wallet could be the starting point for an impactful debut of hybrid “quantum resistant” cryptography tools to align the Member States in the transition. We look at the awareness campaigns of ENISA and national cybersecurity authorities in the USA, Spain, UK and Germany on the transition to PQC using a hybrid approach. There seems to be some early consensus that NIST's PQC algorithms are likely to set the international standard. Given the eIDAS 2′s flexible, technologically neutral language, it allows the timely implementation of new secure encryption methods. The Wallet could be an exemplary model for large businesses, or app developers, and SMEs that also must transition to PQC to render secure those asymmetrically encrypted quantum-vulnerable digital assets. 
A very large and relatively fast uptake of the EUDI Wallet system is expected, and if it holds the promises of functionality, user friendliness, and security across the changing technological world, the EUDI Wallet's approach could become a benchmark for the transition to post-quantum capacity.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106022"},"PeriodicalIF":3.3,"publicationDate":"2024-08-01","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141961483","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"When non-consensual intimate deepfakes go viral: The insufficiency of the UK Online Safety Act","authors":"Beatriz Kira","doi":"10.1016/j.clsr.2024.106024","DOIUrl":"10.1016/j.clsr.2024.106024","url":null,"abstract":"<div><p>Advancements in artificial intelligence (AI) have drastically simplified the creation of synthetic media. While concerns often focus on potential misinformation harms, ‘non-consensual intimate deepfakes’ (NCID) – a form of image-based sexual abuse – pose a current, severe, and growing threat, disproportionately impacting women and girls. This article examines the measures implemented with the recently adopted Online Safety Act 2023 (OSA) and argues that the new criminal offences and the ‘systems and processes’ approach the law adopts are insufficient to counter NCID in the UK. This is because the OSA relies on platform policies that often lack consistency regarding synthetic media and on platforms’ content removal mechanisms which offer limited redress to victim-survivors after the harm has already occurred. 
The article argues that stronger prevention mechanisms are necessary and proposes that the law should mandate all AI-powered deepfake creation tools to ban the generation of intimate synthetic content and require the implementation of comprehensive and enforceable content moderation systems.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106024"},"PeriodicalIF":3.3,"publicationDate":"2024-07-27","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0267364924000906/pdfft?md5=e8c861b6693900d176a62ac2f6801b2e&pid=1-s2.0-S0267364924000906-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141954621","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"The blocking of Booking/Etraveli – When the first victim of EU's anti-US tech stand was a European","authors":"Dr. Christian Bergqvist","doi":"10.1016/j.clsr.2024.106025","DOIUrl":"10.1016/j.clsr.2024.106025","url":null,"abstract":"<div><p>It came somewhat unexpected when Dutch <em>Booking</em>'s acquisition of Swedish <em>Etraveli</em> was blocked in the EU as the parties operated in two separate segments of the online economy, hotel accommodation and flight booking, making the merger unproblematic under normal circumstances. However, in the digital economy, nothing is normal as enforcement has tightened, mostly vis-à-vis US tech giants but apparently also vis-à-vis European undertakings. Interestingly, customers' unwillingness to shop around for offers, as otherwise accepted by, e.g., the UK authority, played a role in the outcome. The decision has been challenged before the EU's General Court, providing a case to watch.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106025"},"PeriodicalIF":3.3,"publicationDate":"2024-07-26","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"https://www.sciencedirect.com/science/article/pii/S0267364924000918/pdfft?md5=988f2f479691439097c5872023c102cd&pid=1-s2.0-S0267364924000918-main.pdf","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141953090","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"OA","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Better alone than in bad company: Addressing the risks of companion chatbots through data protection by design","authors":"Pierre Dewitte","doi":"10.1016/j.clsr.2024.106019","DOIUrl":"10.1016/j.clsr.2024.106019","url":null,"abstract":"<div><p>Recent years have seen a surge in the development and use of companion chatbots, conversational agents specifically designed to act as virtual friends, romantic partners, life coaches or even therapists. Yet, these tools raise many concerns, especially when their target audience is comprised of vulnerable individuals. While the recently adopted AI Act is expected to address some of these concerns, both compliance and enforcement are bound to take time. Since the development of companion chatbots involves the processing of personal data at nearly every step of the process, from training to fine-tuning to deployment, this paper argues that the General Data Protection Regulation (“GDPR”), and data protection by design more specifically, already provides a solid ground for regulators and courts to force controllers to mitigate these risks. In doing so, it sheds light on the broad material scope of Articles 24(1) and 25(1) GDPR, highlights the role of these provisions as proxies to Fundamental Rights Impact Assessments (“FRIAs”), and peels off the many layers of personal data processing involved in the companion chatbots supply chain. 
That reasoning served as the basis for a complaint lodged with the Belgian data protection authority, the full text and supporting evidence of which are provided as supplementary materials.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106019"},"PeriodicalIF":3.3,"publicationDate":"2024-07-25","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141953089","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Critical points for the processing of personal data by the government: An empirical study in Brazil","authors":"Núbia Augusto de Sousa Rocha , Alexandre Nascimento de Almeida , André Nunes , Humberto Angelo","doi":"10.1016/j.clsr.2024.106023","DOIUrl":"10.1016/j.clsr.2024.106023","url":null,"abstract":"<div><p>The General Law for the Protection of Personal Data (LGPD), issued in Brazil in August 2018, establishes as one of the legal bases for the processing of personal data the execution of public policies by the State. A systematic review of the literature identified the existence of six critical points that represent challenges for public managers in the elaboration and implementation of policies that require the processing of personal data. The objective of this research is to establish the levels of criticality of the factors identified by the literature review, as well as to verify the existence of other critical points on which the literature has not yet advanced. To this end, a group of 11 specialists was selected to participate in the research that used the Delphi Method, a technique that consists of applying a set of questionnaires sequentially and individually, in order to establish a dialog between the participants and build a collective response. The results indicate a coherence between what was verified in the theory and the perception of the specialists. Another 10 critical points for the processing of personal data by the government were mentioned by the participants. 
In general, the main elements of tension identified addressed the lack of training of public officials and the sharing of personal data.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106023"},"PeriodicalIF":3.3,"publicationDate":"2024-07-24","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141951122","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Data sovereignty and data transfers as fundamental elements of digital transformation: Lessons from the BRICS countries","authors":"Luca Belli , Water B. Gaspar , Shilpa Singh Jaswant","doi":"10.1016/j.clsr.2024.106017","DOIUrl":"10.1016/j.clsr.2024.106017","url":null,"abstract":"<div><p>When talking about digital transformation, data sovereignty considerations and data transfers cannot be excluded from the discussion, given the considerable likelihood that digital technologies deployed along the process collect, process and transfer (personal) data in multiple jurisdictions. An increasing number of nations, especially those within the BRICS grouping (Brazil, Russia, India, China, and South Africa) are developing their data governance and digital transformation approaches based on data sovereignty considerations, deeming specific types of data as key strategic and economic resources, which deserve particular protection and that must be leveraged for national development. From this perspective, this paper will try to shed light on how data sovereignty and data transfers interplay in the context of digital transformations. Particularly, we will consider the various dimensions that compose the concept of data sovereignty and will utilise a range of examples from the BRICS grouping to back some of the key considerations developed with empirical evidence. We define data sovereignty as the capacity to understand how and why (personal) data are processed and by whom, develop data processing capabilities, and effectively regulate data processing, thus retaining self-determination and control. We have chosen the BRICS grouping for three reasons. First, research on the grouping's data policies and digital transformation is still minimal despite their leading role. 
Second, BRICS account for over 40 % of the global population, or 3.2 billion people (which can be seen as 3.2 billion “data subjects” or data producers, depending on perspective, thus making them key players in data governance and digital transformation. Third, the BRICS members have realised that digital transformation is essential for the future of their economies and societies and have shaped specific data governance visions which must be considered by other countries, especially from the global majority, to understand why data governance is instrumental to foster thriving digital environments.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106017"},"PeriodicalIF":3.3,"publicationDate":"2024-07-23","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141960829","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Open Banking goes to Washington: Lessons from the EU on regulatory-driven data sharing regimes","authors":"Giuseppe Colangelo","doi":"10.1016/j.clsr.2024.106018","DOIUrl":"10.1016/j.clsr.2024.106018","url":null,"abstract":"<div><p>After representing the main country embracing a market-led approach to Open Banking, the U.S. is on the verge of switching to a regulatory-driven regime by mandating the sharing of financial data. Relying on the Section 1033 of the Dodd-Frank Act, the Consumer Financial Protection Bureau (CFPB) has, indeed, recently proposed a rulemaking on “Personal Financial Data Rights.” As the U.S. is, therefore, apparently following the EU, which has been at the forefront of the government-led Open Banking movement, the paper aims at analyzing the CFPB's proposal by taking stock of the EU experience. The review of the EU regulatory framework and its UK implementation provides useful insights about the functioning and challenging trade-offs of Open Banking, thus ultimately enabling us to assess whether the CFPB's proposal would provide significant added value for innovation and competition or would rather represent an unnecessary regulatory burden.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106018"},"PeriodicalIF":3.3,"publicationDate":"2024-07-17","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141637504","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Algorithmic proxy discrimination and its regulations","authors":"Xi Chen","doi":"10.1016/j.clsr.2024.106021","DOIUrl":"https://doi.org/10.1016/j.clsr.2024.106021","url":null,"abstract":"<div><p>As a specific type of algorithmic discrimination, algorithmic proxy discrimination (APD) exerts disparate impacts on legally protected groups because machine learning algorithms adopt facially neutral proxies to refer to legally protected features through their operational logic. Based on the relationship between sensitive feature data and the outcome of interest, APD can be classified as direct or indirect conductive. In the context of big data, the abundance and complexity of algorithmic proxy relations render APD inescapable and difficult to discern, while opaque algorithmic proxy relations impede the imputation of APD. Thus, as traditional antidiscrimination law strategies, such as blocking relevant data or disparate impact liability, are modeled on human decision-making and cannot effectively regulate APD. The paper proposes a regulatory framework targeting APD based on data and algorithmic aspects.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 106021"},"PeriodicalIF":3.3,"publicationDate":"2024-07-14","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141606221","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}
{"title":"Ontological models for representing image-based sexual abuses","authors":"Mattia Falduti, Cristine Griffo","doi":"10.1016/j.clsr.2024.105999","DOIUrl":"https://doi.org/10.1016/j.clsr.2024.105999","url":null,"abstract":"<div><p>In recent years, there has been extensive discourse on the moderation of abusive content online. Image-based Sexual Abuses (IBSAs) represent a type of abusive content that involves sexual images or videos. Platforms must moderate user-generated online content to tackle this issue effectively. One way to achieve this is by allowing users to report content, which can be flagged as abusive. In such instances, platforms may enforce their terms of service and prohibit certain types of content or users. Alongside these efforts, numerous countries have been making progress in defining and regulating this subject by implementing dedicated regulations. However, national solutions alone are insufficient for addressing a constantly increasing global emergency. Consequently, digital platforms create their own definitions of abusive conduct to overcome obstacles arising from conflicting national laws. In this paper, we use an ontological approach to model two types of abusive behavior. To do this, we applied the UFO-L patterns to build ontological models and based them on a top-level ontology, the Unified Foundational Ontology (UFO). 
The outcome is a set of ontological models that digital platforms can use to monitor and manage user compliance with the service provider’s code of conduct.</p></div>","PeriodicalId":51516,"journal":{"name":"Computer Law & Security Review","volume":"54 ","pages":"Article 105999"},"PeriodicalIF":3.3,"publicationDate":"2024-07-06","publicationTypes":"Journal Article","fieldsOfStudy":null,"isOpenAccess":false,"openAccessPdf":"","citationCount":null,"resultStr":null,"platform":"Semanticscholar","paperid":"141593645","PeriodicalName":null,"FirstCategoryId":null,"ListUrlMain":null,"RegionNum":3,"RegionCategory":"社会学","ArticlePicture":[],"TitleCN":null,"AbstractTextCN":null,"PMCID":"","EPubDate":null,"PubModel":null,"JCR":null,"JCRName":null,"Score":null,"Total":0}